Patents by Inventor Haijun Xia

Haijun Xia has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230394241
    Abstract: A computer system for generating graphics content receives text or information specifying an amount of spoken language and uses natural language processing (NLP) to extract linguistic structures associated with the text or the spoken language. It then determines mappings between the linguistic structures and the graphics content based at least in part on a predefined grammar. The predefined grammar may specify a target context for matching to arguments associated with the linguistic structures, and may specify one or more corresponding graphics elements having associated appearances, layouts, and/or graphics effects. The computer system then generates the graphics content associated with the text or the spoken language. An illustrative sketch of this grammar-based mapping appears after this listing.
    Type: Application
    Filed: October 14, 2021
    Publication date: December 7, 2023
    Inventor: Haijun Xia
  • Publication number: 20230342383
    Abstract: A method and system for managing workflows receives a text string being typed within a data document and executes a connection engine that performs natural language processing (NLP) to extract words and phrases having keywords corresponding to data operations and to parse the text string into nested nodes including sub-phrases of arguments and keywords. The arguments and keywords are assembled into one or more complete data operations, which are executed to return matching results from within a dataset as dependent phrase candidates to complete the text string. The writer selects a candidate from the dependent phrase candidates, in response to which the connection engine creates a persistent text-data connection between the selected candidate and the dataset. This persistent text-data connection automatically updates the selected candidate when one or more of the dataset, arguments, and keywords are modified. An illustrative sketch of this connection flow appears after this listing.
    Type: Application
    Filed: April 21, 2023
    Publication date: October 26, 2023
    Inventor: Haijun Xia
  • Patent number: 10852849
    Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen. An illustrative sketch of the control logic appears after this listing.
    Type: Grant
    Filed: May 6, 2016
    Date of Patent: December 1, 2020
    Assignee: Autodesk, Inc.
    Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
  • Patent number: 10684758
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects. An illustrative sketch of the framing computation appears after this listing.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: June 16, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
  • Patent number: 10592050
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing the location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system. An illustrative sketch of this prediction appears after this listing.
    Type: Grant
    Filed: June 14, 2018
    Date of Patent: March 17, 2020
    Assignee: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
  • Patent number: 10592049
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
    Type: Grant
    Filed: June 14, 2018
    Date of Patent: March 17, 2020
    Assignee: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
  • Patent number: 10558341
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: February 11, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
  • Patent number: 10466812
    Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
    Type: Grant
    Filed: May 6, 2016
    Date of Patent: November 5, 2019
    Assignee: Autodesk, Inc.
    Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
  • Publication number: 20180292945
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
    Type: Application
    Filed: June 14, 2018
    Publication date: October 11, 2018
    Applicant: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
  • Publication number: 20180292946
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
    Type: Application
    Filed: June 14, 2018
    Publication date: October 11, 2018
    Applicant: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
  • Patent number: 10088952
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
    Type: Grant
    Filed: September 18, 2015
    Date of Patent: October 2, 2018
    Assignee: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
  • Publication number: 20180239520
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
  • Publication number: 20180239519
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
  • Publication number: 20170031469
    Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
    Type: Application
    Filed: May 6, 2016
    Publication date: February 2, 2017
    Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
  • Publication number: 20170031468
    Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
    Type: Application
    Filed: May 6, 2016
    Publication date: February 2, 2017
    Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
  • Publication number: 20160188112
    Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
    Type: Application
    Filed: September 18, 2015
    Publication date: June 30, 2016
    Applicant: Tactual Labs Co.
    Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
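
Publication 20230394241 describes mapping linguistic structures extracted from text or speech to graphics elements through a predefined grammar. The Python sketch below is a minimal illustration of that idea under heavy assumptions: the toy extractor, the grammar rules, and every name in it are invented for this example and do not come from the application.

```python
# Illustrative sketch only: mapping extracted linguistic structures to
# graphics elements via a predefined grammar (cf. publication 20230394241).
# The extractor and grammar below are toy stand-ins, not the patented method.

from dataclasses import dataclass


@dataclass
class LinguisticStructure:
    """Minimal stand-in for an NLP parse result: a verb and its arguments."""
    verb: str
    arguments: list[str]


@dataclass
class GrammarRule:
    """One predefined-grammar entry: a target context matched against a
    structure's arguments, plus the graphics element it maps to."""
    target_context: str
    graphics_element: str
    layout: str = "default"
    effect: str = "none"


# Hypothetical predefined grammar.
GRAMMAR = [
    GrammarRule("bar", "bar_chart", layout="vertical"),
    GrammarRule("trend", "line_chart", effect="animate_draw"),
    GrammarRule("share", "pie_chart"),
]


def extract_structures(text: str) -> list[LinguisticStructure]:
    """Toy extractor standing in for a real NLP pipeline: treats the first
    word as the verb and the remaining words as its arguments."""
    words = text.lower().split()
    return [LinguisticStructure(words[0], words[1:])] if words else []


def map_to_graphics(structures: list[LinguisticStructure]) -> list[dict]:
    """Match each structure's arguments against the grammar's target contexts
    and emit the corresponding graphics elements."""
    elements = []
    for s in structures:
        for rule in GRAMMAR:
            if rule.target_context in s.arguments:
                elements.append({
                    "element": rule.graphics_element,
                    "layout": rule.layout,
                    "effect": rule.effect,
                    "source_verb": s.verb,
                })
    return elements


if __name__ == "__main__":
    spoken = "show the sales trend for last year"
    print(map_to_graphics(extract_structures(spoken)))
    # [{'element': 'line_chart', 'layout': 'default',
    #   'effect': 'animate_draw', 'source_verb': 'show'}]
```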
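
Publication 20230342383 describes a connection engine that turns keywords typed in prose into data operations, offers matching results as completions, and keeps the chosen completion linked to the dataset. The sketch below shows one way that flow could look, assuming a simple keyword table and an in-memory dataset; the class and operation names are hypothetical.

```python
# Illustrative sketch only: a text-data connection in the spirit of
# publication 20230342383. Keyword table, dataset shape, and class names are
# assumptions made for this example.

from dataclasses import dataclass, field

# Assumed mapping from keywords found in the typed text to data operations.
OPERATIONS = {
    "average": lambda values: sum(values) / len(values),
    "total": sum,
    "maximum": max,
}


@dataclass
class TextDataConnection:
    """Persistent link between a phrase in the document and the dataset."""
    keyword: str
    column: str
    dataset: dict = field(repr=False)

    def value(self) -> float:
        # Re-evaluated on demand, so edits to the dataset update the phrase.
        return OPERATIONS[self.keyword](self.dataset[self.column])


def complete(text: str, dataset: dict) -> list[TextDataConnection]:
    """Parse the typed string for operation keywords and column arguments,
    then return candidate connections that could complete the phrase."""
    words = text.lower().split()
    keywords = [w for w in words if w in OPERATIONS]
    columns = [w for w in words if w in dataset]
    return [TextDataConnection(k, c, dataset) for k in keywords for c in columns]


if __name__ == "__main__":
    data = {"revenue": [120, 80, 100], "units": [10, 12, 9]}
    chosen = complete("the average revenue was", data)[0]
    print(chosen.value())        # 100.0
    data["revenue"].append(200)  # modifying the dataset...
    print(chosen.value())        # ...updates the connected value: 125.0
```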
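
Patents 10852849 and 10466812 cover a finger-mounted stylus whose control circuitry applies an electric charge to the conductive tip when the tip is in contact with or proximate to the touchscreen. That behavior can be pictured as a simple control loop; read_tip_proximity and set_tip_charge below are made-up stubs standing in for the device's sensing and drive circuitry, and the threshold is an assumption.

```python
# Illustrative sketch only: the charge-when-proximate control behavior of the
# finger-mounted stylus in patents 10852849 / 10466812. Hardware access is
# faked with stubs; nothing here is the patented circuitry.

import random
import time

PROXIMITY_THRESHOLD_MM = 2.0   # assumed "close enough to the screen" distance


def read_tip_proximity() -> float:
    """Stub: distance (mm) from the conductive tip to the touchscreen."""
    return random.uniform(0.0, 10.0)


def set_tip_charge(enabled: bool) -> None:
    """Stub: drive (or stop driving) the charge applied to the conductive tip."""
    print("tip charge", "on" if enabled else "off")


def control_loop(iterations: int = 5) -> None:
    """Apply the charge only while the tip is at or near the screen surface."""
    for _ in range(iterations):
        distance = read_tip_proximity()
        set_tip_charge(distance <= PROXIMITY_THRESHOLD_MM)
        time.sleep(0.01)


if __name__ == "__main__":
    control_loop()
```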
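
Patents 10684758 and 10558341 describe distinguishing pan, zoom, and frame gestures and computing a focus area between vectors extending outwards from the thumb and forefinger. The sketch below shows one rough way to approximate that: a heuristic classifier over successive touch frames and a quadrilateral focus area tested with point-in-polygon. The thresholds, the fixed reach, and the polygonal approximation are assumptions, not the patented method.

```python
# Illustrative sketch only: a crude take on the bimanual framing interaction
# of patents 10684758 / 10558341. Thresholds and geometry are assumptions.

from math import dist

SPREAD_THRESHOLD = 20.0   # px of pinch-distance change that signals a zoom
REACH = 400.0             # px the focus area extends beyond the two fingers


def classify(prev_touches, curr_touches):
    """Rough gesture classifier over two successive frames of touch points."""
    if len(curr_touches) < 2:
        return "pan"
    spread_change = abs(dist(*curr_touches[:2]) - dist(*prev_touches[:2]))
    return "zoom" if spread_change > SPREAD_THRESHOLD else "frame"


def focus_area(thumb, forefinger, thumb_dir, finger_dir, reach=REACH):
    """Approximate the focus area as a quadrilateral bounded by the two touch
    points and rays extending outwards along each finger's direction."""
    tx, ty = thumb
    fx, fy = forefinger
    return [
        (tx, ty),
        (tx + reach * thumb_dir[0], ty + reach * thumb_dir[1]),
        (fx + reach * finger_dir[0], fy + reach * finger_dir[1]),
        (fx, fy),
    ]


def inside(point, polygon):
    """Standard ray-casting point-in-polygon test."""
    x, y = point
    hit = False
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            hit = not hit
    return hit


if __name__ == "__main__":
    prev = [(100, 300), (100, 100)]
    curr = [(100, 300), (100, 100)]       # thumb and forefinger held steady
    print(classify(prev, curr))           # 'frame'
    area = focus_area(thumb=(100, 300), forefinger=(100, 100),
                      thumb_dir=(1.0, 0.0), finger_dir=(1.0, 0.0))
    objects = {"photo": (250, 200), "toolbar": (50, 50)}
    print([n for n, pos in objects.items() if inside(pos, area)])  # ['photo']
```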
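
Patents 10592050, 10592049, and 10088952 (and the related applications above) describe applying a model of user interaction to hover data so that a predicted touchdown location and time can be handed to the touch response system before contact. A constant-velocity extrapolation is one simple model consistent with that description; the specific model and sample format below are assumptions, not the claimed method.

```python
# Illustrative sketch only: predicting a future touch from hover samples in
# the spirit of patents 10592050 / 10592049 / 10088952. The constant-velocity
# model and sample format are assumptions.

from dataclasses import dataclass


@dataclass
class HoverSample:
    x: float   # position over the surface (px)
    y: float
    z: float   # height above the surface (mm); 0 means contact
    t: float   # timestamp (s)


def predict_touch(samples: list[HoverSample]):
    """Extrapolate the last two hover samples with a constant-velocity model
    to estimate where and when the finger will reach z == 0."""
    if len(samples) < 2:
        return None
    a, b = samples[-2], samples[-1]
    dt = b.t - a.t
    if dt <= 0:
        return None
    vz = (b.z - a.z) / dt
    if vz >= 0:               # finger not descending; no touch predicted
        return None
    time_to_contact = -b.z / vz
    return {
        "x": b.x + (b.x - a.x) / dt * time_to_contact,
        "y": b.y + (b.y - a.y) / dt * time_to_contact,
        "t": b.t + time_to_contact,
    }


if __name__ == "__main__":
    hover = [HoverSample(100, 200, 12.0, 0.00),
             HoverSample(104, 202, 8.0, 0.02)]
    # The touch response system could start preparing the target under the
    # predicted point this early, before the finger actually lands.
    print(predict_touch(hover))  # x ~ 112, y ~ 206, t ~ 0.06 s
```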