Patents by Inventor Haijun Xia
Haijun Xia has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230394241
Abstract: A computer system for generating graphics content receives text or information specifying an amount of spoken language and uses natural language processing (NLP) to extract linguistic structures associated with the text or the spoken language, then determines mappings between the linguistic structures and the graphics content based at least in part on a predefined grammar. The predefined grammar may specify a target context for matching to arguments associated with the linguistic structures, and may specify one or more corresponding graphics elements having associated appearances, layouts and/or graphics effects. The computer system then generates the graphics content associated with the text or the spoken language.
Type: Application
Filed: October 14, 2021
Publication date: December 7, 2023
Inventor: Haijun Xia
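The grammar-driven mapping the abstract describes can be sketched roughly as follows. Every rule, predicate, and graphics-element name here is a hypothetical illustration, not taken from the patent; the real system's NLP extraction is far richer than this keyword match.

```python
# A predefined grammar: (predicate, target context) pairs map to graphics
# elements with associated appearance and effect attributes (all illustrative).
PREDEFINED_GRAMMAR = {
    ("increase", "quantity"): {"element": "arrow", "direction": "up", "effect": "grow"},
    ("decrease", "quantity"): {"element": "arrow", "direction": "down", "effect": "shrink"},
    ("compare", "entities"): {"element": "bar_chart", "layout": "side_by_side"},
}

def extract_structures(text):
    """Stand-in for the NLP step: return (predicate, context) pairs found in the text."""
    lowered = text.lower()
    return [key for key in PREDEFINED_GRAMMAR if key[0] in lowered]

def generate_graphics(text):
    """Map the extracted linguistic structures to graphics elements via the grammar."""
    return [PREDEFINED_GRAMMAR[s] for s in extract_structures(text)]

print(generate_graphics("Sales will increase next quarter"))
# -> [{'element': 'arrow', 'direction': 'up', 'effect': 'grow'}]
```

In this sketch the grammar's "target context" is carried in the rule key; a fuller implementation would match it against parsed arguments rather than raw substrings.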
-
Publication number: 20230342383
Abstract: A method and system for managing workflows receives a text string as it is typed within a data document and executes a connection engine that performs natural language processing (NLP) to extract words and phrases whose keywords correspond to data operations, and parses the text string into nested nodes containing sub-phrases of arguments and keywords. The arguments and keywords are assembled into one or more complete data operations, which are executed to return matching results from within a dataset as dependent phrase candidates that complete the text string. The writer selects one of the dependent phrase candidates, in response to which the connection engine creates a persistent text-data connection between the selected candidate and the dataset. This persistent text-data connection automatically updates the selected candidate when the dataset, arguments, or keywords are modified.
Type: Application
Filed: April 21, 2023
Publication date: October 26, 2023
Inventor: Haijun Xia
-
Patent number: 10852849
Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
Type: Grant
Filed: May 6, 2016
Date of Patent: December 1, 2020
Assignee: Autodesk, Inc.
Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
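The control-circuitry behavior in the claim reduces to a simple gating rule: energize the conductive tip only when it is in contact with or near the touchscreen. A firmware-style sketch, where the proximity threshold and sensor interface are assumptions rather than anything specified in the patent:

```python
PROXIMITY_THRESHOLD_MM = 2.0  # assumed activation distance, not from the patent

def tip_charge_enabled(distance_mm):
    """Return True (apply charge to the conductive tip) when the tip is
    in contact with (0 mm) or proximate to the touchscreen."""
    return distance_mm <= PROXIMITY_THRESHOLD_MM

print(tip_charge_enabled(0.0))   # in contact -> True, charge applied
print(tip_charge_enabled(10.0))  # far from the screen -> False, no charge
```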
-
Patent number: 10684758
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Grant
Filed: February 20, 2017
Date of Patent: June 16, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
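The focus area described in the abstract is, geometrically, the angular sector between the two vectors extending outward from the thumb and forefinger. One assumed simplification of the hit test (a common origin for both vectors; no handling of sector wrap-around past 180 degrees) might look like this:

```python
import math

def angle(origin, point):
    """Angle of the vector from origin to point, in radians."""
    return math.atan2(point[1] - origin[1], point[0] - origin[0])

def in_focus_area(origin, thumb_point, finger_point, obj):
    """True when obj lies in the angular sector between the thumb and
    forefinger vectors (a simplified model of the framing focus area)."""
    a1 = angle(origin, thumb_point)
    a2 = angle(origin, finger_point)
    lo, hi = min(a1, a2), max(a1, a2)
    return lo <= angle(origin, obj) <= hi

hand = (0.0, 0.0)
thumb, finger = (10.0, 0.0), (0.0, 10.0)          # vectors at 0 and 90 degrees
print(in_focus_area(hand, thumb, finger, (5.0, 5.0)))   # inside the sector -> True
print(in_focus_area(hand, thumb, finger, (-5.0, 5.0)))  # outside the sector -> False
```

Objects passing this test would then receive the indication and contextual menus the abstract mentions.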
-
Patent number: 10592049
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Grant
Filed: June 14, 2018
Date of Patent: March 17, 2020
Assignee: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
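The abstract's "model of user interaction" is unspecified; one assumed, deliberately simple stand-in is constant-velocity extrapolation over hover samples, which yields exactly the two outputs the claim names: a predicted touch location and a predicted touch time.

```python
def predict_touch(samples):
    """samples: list of (t, x, y, height) hover readings above the surface.
    Returns a predicted (touch_time, x, y), assuming the finger keeps its
    current lateral and descent velocity, or None if it is not approaching."""
    (t0, x0, y0, h0), (t1, x1, y1, h1) = samples[-2], samples[-1]
    dt = t1 - t0
    vh = (h1 - h0) / dt              # descent rate (negative while approaching)
    if vh >= 0:
        return None                  # finger is rising or hovering in place
    time_to_touch = -h1 / vh         # time until height reaches zero
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (t1 + time_to_touch,
            x1 + vx * time_to_touch,
            y1 + vy * time_to_touch)

# Finger descending from 8 mm to 4 mm over 20 ms while moving right:
# predicts a touch roughly 20 ms later at about x = 14.
print(predict_touch([(0.00, 10.0, 5.0, 8.0), (0.02, 12.0, 5.0, 4.0)]))
```

Delivering this prediction to the touch response system before the physical contact is what lets the device begin responding ahead of the actual touch event.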
-
Patent number: 10592050
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Grant
Filed: June 14, 2018
Date of Patent: March 17, 2020
Assignee: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
-
Patent number: 10558341
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Grant
Filed: February 20, 2017
Date of Patent: February 11, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
-
Patent number: 10466812
Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
Type: Grant
Filed: May 6, 2016
Date of Patent: November 5, 2019
Assignee: Autodesk, Inc.
Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
-
Publication number: 20180292945
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Application
Filed: June 14, 2018
Publication date: October 11, 2018
Applicant: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
-
Publication number: 20180292946
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Application
Filed: June 14, 2018
Publication date: October 11, 2018
Applicant: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
-
Patent number: 10088952
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Grant
Filed: September 18, 2015
Date of Patent: October 2, 2018
Assignee: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia
-
Publication number: 20180239520
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
-
Publication number: 20180239519
Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
Type: Application
Filed: February 20, 2017
Publication date: August 23, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
-
Publication number: 20170031468
Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
Type: Application
Filed: May 6, 2016
Publication date: February 2, 2017
Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
-
Publication number: 20170031469
Abstract: A finger-mounted stylus for performing touch-based input on a touchscreen includes a fingertip case configured to attach to a user fingertip, an extension arm that is coupled to the fingertip case and includes a conductive tip, wherein the extension arm is configured to position the conductive tip away from the fingertip case, and control circuitry configured to apply an electric charge to the conductive tip when the conductive tip is in contact with or proximate to the touchscreen.
Type: Application
Filed: May 6, 2016
Publication date: February 2, 2017
Inventors: Tovi Grossman, George Fitzmaurice, Haijun Xia
-
Publication number: 20160188112
Abstract: A system and method are disclosed for using a touch sensing system capable of sensing location of a finger or object above a touch surface to inform a touch response system in an electronic device of a predicted future user input event or motion data in advance of an actual touch event. Current user input is sensed via the touch sensing system and data reflecting hover information is created. A model of user interaction with a touch surface is applied to the data representative of the user input to create data reflecting a prediction of a future user input event. In an embodiment, prior to occurrence of the predicted user input event, a predicted location and a predicted time at which the predicted future user input event will occur are provided to a touch response system.
Type: Application
Filed: September 18, 2015
Publication date: June 30, 2016
Applicant: Tactual Labs Co.
Inventors: Clifton Forlines, Ricardo Jorge Jota Costa, Daniel Wigdor, Karan Singh, Haijun Xia