Patents by Inventor Dan Hwang

Dan Hwang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10521105
    Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
    Type: Grant
    Filed: June 14, 2018
    Date of Patent: December 31, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lynn Dai, Dan Hwang
  • Publication number: 20190138178
    Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
    Type: Application
    Filed: June 14, 2018
    Publication date: May 9, 2019
    Inventors: Lynn Dai, Dan Hwang
  • Patent number: 10120568
    Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
    Type: Grant
    Filed: October 8, 2015
    Date of Patent: November 6, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
  • Patent number: 10025489
    Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
    Type: Grant
    Filed: September 16, 2013
    Date of Patent: July 17, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lynn Dai, Dan Hwang
  • Patent number: 9262012
    Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
    Type: Grant
    Filed: January 3, 2014
    Date of Patent: February 16, 2016
    Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
  • Publication number: 20160026385
    Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
    Type: Application
    Filed: October 8, 2015
    Publication date: January 28, 2016
    Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
  • Patent number: 9170736
    Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
    Type: Grant
    Filed: September 16, 2013
    Date of Patent: October 27, 2015
    Inventors: Lynn Dai, Dan Hwang, Bo-June Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
  • Publication number: 20150231491
    Abstract: Example apparatus and methods provide a virtual control for a video game played on a hover-sensitive device. A method may establish a hover point for an object located in a hover space produced by the device. The hover point may be associated with a three dimensional virtual joystick that may or may not be displayed. The virtual joystick processes inputs from the three dimensional hover space. The inputs have a z component. The z component may characterize, for example, a distance between the object and the device, or a rate at which the object is approaching or moving away from the device. The video game may be controlled in a third dimension based on the z component. For example, a character may crouch down or stand up based on the z component, or the area of a spell may be expanded or contracted based on the z component.
    Type: Application
    Filed: February 19, 2014
    Publication date: August 20, 2015
    Applicant: Microsoft Corporation
    Inventors: Dan Hwang, Lynn Dai
  • Publication number: 20150234468
    Abstract: Example apparatus and methods support interactions between a hover-sensitive apparatus and other apparatus. A hover action performed in the hover space of one apparatus can control that apparatus or another apparatus. The interactions may depend on the positions of the apparatus. For example, a user may virtually pick up an item on a first hover-sensitive apparatus and virtually toss it to another apparatus using a hover gesture. A directional gesture may selectively send content to a target apparatus while a directionless gesture may send content to a distribution list or to any apparatus in range. A shared display may be produced for multiple interconnected devices and coordinated information may be presented on the shared display. For example, a chessboard that spans two smartphones may be displayed and a hover gesture may virtually lift a chess piece from one of the displays and deposit it on another of the displays.
    Type: Application
    Filed: February 19, 2014
    Publication date: August 20, 2015
    Applicant: Microsoft Corporation
    Inventors: Dan Hwang, Lynn Dai
  • Publication number: 20150205400
    Abstract: Example apparatus and methods detect how a portable (e.g., handheld) device (e.g., phone, tablet) is gripped (e.g., held, supported). Detecting the grip may include detecting and characterizing touch points for fingers, thumbs, palms, or surfaces that are involved in supporting and positioning the apparatus. Example apparatus and methods may determine whether and how an apparatus is being held and then may exercise control based on the grip detection. For example, a display on an input/output interface may be reconfigured, physical controls (e.g., push buttons) on the apparatus may be remapped, user interface elements may be repositioned, resized, or repurposed, portions of the input/output interface may be desensitized or hyper-sensitized, virtual controls may be remapped, or other actions may be taken. Touch sensors may detect the pressure with which a smart phone is being gripped and produce control events (e.g., on/off, louder/quieter, brighter/dimmer, press and hold) based on the pressure.
    Type: Application
    Filed: January 21, 2014
    Publication date: July 23, 2015
    Applicant: Microsoft Corporation
    Inventors: Dan Hwang, Muhammad Usman, Scott Greenlay, Moshe Sapir
  • Publication number: 20150199030
    Abstract: Example apparatus and methods concern a first device (e.g., phone, tablet) having a touch and hover-sensitive display. The first device may detect a second device (e.g., television, monitor) that has a second display. After establishing a communication link and a context between the first and second device, the first device may provide a first output (e.g., movie, game) to be displayed on the second device. In response to identifying a hover point produced in a hover space associated with the first device, the first device may provide a second output (e.g., user interface element, cursor) for display on the second display. The second output may be based on the context and on a hover action associated with the hover point. The user may then cause a control event to be generated by interacting with the second display using the second output in relation to the cursor.
    Type: Application
    Filed: January 10, 2014
    Publication date: July 16, 2015
    Applicant: Microsoft Corporation
    Inventors: Petteri Mikkola, Dan Hwang
  • Publication number: 20150193040
    Abstract: Example apparatus and methods concern detecting an angle at which an object is interacting with a hover-sensitive input/output interface. An example apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The proximity detector may provide three dimensional position information for the object (e.g., x,y,z). The angle may be determined from a first (x,y,z) measurement associated with a first portion (e.g., tip) of the object and a second (x,y,z) measurement associated with a second portion (e.g., end) of the object. The position of the object may determine a hover point on the interface while the position and angle may determine an intersection point on the interface. User interface elements or other information displayed on the interface may be manipulated based, at least in part, on the intersection point. Multiple objects interacting at multiple angles may be detected and responded to.
    Type: Application
    Filed: January 3, 2014
    Publication date: July 9, 2015
    Applicant: Microsoft Corporation
    Inventors: Dan Hwang, Lynn Dai, Muhammad Usman
  • Publication number: 20150177866
    Abstract: Example apparatus and methods concern detecting and responding to a multiple hover point gesture performed for a hover-sensitive device. An example apparatus may include a hover-sensitive input/output interface configured to detect multiple objects in a hover-space associated with the hover-sensitive input/output interface. The apparatus may include logics configured to identify an object in the hover-space, to characterize an object in the hover-space, to track an object in the hover-space, to identify a multiple hover point gesture based on the identification, characterization, and tracking, and to control a device, application, interface, or object based on the multiple hover point gesture. In different embodiments, multiple hover point gestures may be performed in one, two, three, or four dimensions. In one embodiment, the apparatus may be event driven with respect to handling gestures.
    Type: Application
    Filed: December 23, 2013
    Publication date: June 25, 2015
    Inventors: Dan Hwang, Scott Greenlay, Christopher Fellowes, Bob Schriver
  • Publication number: 20150160819
    Abstract: Example apparatus and methods concern detecting and responding to a crane gesture performed for a touch or hover-sensitive device. An example apparatus may include a hover-sensitive input/output interface configured to display an object that can be manipulated using a crane gesture. The apparatus may include a proximity detector configured to detect an object in a hover-space associated with the hover-sensitive input/output interface. The apparatus may include logics configured to change a state of the object from untouched to target to pinched to lifted to released in response to detecting the appearance and movement of bracket points. The appearance of the object may change in response to detecting the state changes.
    Type: Application
    Filed: December 6, 2013
    Publication date: June 11, 2015
    Inventors: Dan Hwang, Scott Greenlay, Christopher Fellows, Thamer Abanami, Jose Rodriguez, Joe Tobens
  • Publication number: 20150077338
    Abstract: Example apparatus and methods concern establishing, managing, or dis-establishing a primary hover-point for a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may identify where the object is located, how the object is moving, what the object is doing, or other attributes of the object. The apparatus may assign a hover point designation to the object as a function of the characterization data. The apparatus selectively controls input actions associated with the object based on the hover point designation. The apparatus may accept input actions associated with a primary hover point and ignore actions associated with a non-primary hover point.
    Type: Application
    Filed: September 16, 2013
    Publication date: March 19, 2015
    Applicant: Microsoft Corporation
    Inventors: Lynn Dai, Dan Hwang
  • Publication number: 20150082216
    Abstract: Example apparatus and methods concern controlling a hover-sensitive input/output interface. One example apparatus includes a proximity detector that detects an object in a hover-space associated with the input/output interface. The apparatus produces characterization data concerning the object. The characterization data may be independent of where in the hover-space the object is located. The apparatus selectively controls the activation, display, and deactivation of user interface elements displayed by the apparatus on the input/output interface as a function of the characterization data and interface state.
    Type: Application
    Filed: September 16, 2013
    Publication date: March 19, 2015
    Applicant: Microsoft Corporation
    Inventors: Lynn Dai, Dan Hwang, Paul Hsu, Raymond Quan, Eric Badger, Jose Rodriguez, Peter Gregory Davis
  • Publication number: 20150077345
    Abstract: Example apparatus and methods concern processing simultaneous touch and hover actions for a touch-sensitive and hover-sensitive input/output (i/o) interface. One example apparatus includes a touch detector that detects an object that touches the i/o interface. The example apparatus includes a proximity detector that detects an object in a hover-space associated with the i/o interface. The apparatus produces characterization data concerning the touch action and the hover action. The proximity detector and the touch detector may share a set of capacitive sensing nodes. Example apparatus and methods selectively control input/output actions on the i/o interface based on a combination of the touch action(s) and the hover action(s).
    Type: Application
    Filed: September 16, 2013
    Publication date: March 19, 2015
    Applicant: Microsoft Corporation
    Inventors: Dan Hwang, Lynn Dai
  • Patent number: D833350
    Type: Grant
    Filed: February 26, 2016
    Date of Patent: November 13, 2018
    Assignee: DENSO International America, Inc.
    Inventors: Heather Windel, Gareth Webb, Ronald Clogg, Feng Xue, Robert Wunsche, III, Dan Hwang, Ronald Woo
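
Several of the abstracts above (patent 9262012 and publication 20150193040) describe deriving an angle and an intersection point from two (x, y, z) measurements of a hovering object, one at its tip and one at its end. The geometry can be sketched as below; the function name, the choice of z = 0 as the screen plane, and the return values are assumptions of this illustration, not the patented method:

```python
import math

def hover_geometry(tip, end):
    """Given two (x, y, z) samples along a hovering object -- its tip and
    its end -- return the hover point (the tip projected straight down onto
    the screen), the point where the object's axis, extended, intersects the
    screen plane z = 0, and the tilt angle (degrees) between the axis and
    that plane. Returns None for the intersection when the object is
    parallel to the screen."""
    (x1, y1, z1), (x2, y2, z2) = tip, end
    hover_point = (x1, y1)              # perpendicular projection of the tip
    dz = z2 - z1
    horiz = math.hypot(x2 - x1, y2 - y1)
    angle = math.degrees(math.atan2(abs(dz), horiz))
    if abs(dz) < 1e-9:                  # axis parallel to screen: no intersection
        return hover_point, None, angle
    t = -z1 / dz                        # line parameter where z reaches 0
    intersection = (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return hover_point, intersection, angle
```

For a finger whose tip is at (0, 0, 10) and whose end is at (10, 0, 20), the hover point is (0, 0) but the axis meets the screen at (-10, 0) at a 45-degree tilt, which is why the abstract distinguishes the hover point from the intersection point when deciding which on-screen element is being targeted.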