Patents Assigned to XRSPACE CO., LTD.
  • Patent number: 10817047
    Abstract: A tracking system is disclosed. The tracking system comprises a head-mounted display (HMD) worn on the head of a user and configured to virtualize a body movement of the user in a virtual environment; and a plurality of sensors worn on the feet of the user and configured to determine body information of the user according to the body movement of the user, and to transmit the determined body information to the HMD; wherein the HMD virtualizes the body movement of the user according to the determined body information; and wherein the body information is related to a plurality of mutual relationships between the plurality of sensors and the HMD.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: October 27, 2020
    Assignee: XRSpace CO., LTD.
    Inventors: Peter Chou, Chun-Wei Lin, Yi-Kang Hsieh, Chia-Wei Wu
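The abstract's "mutual relationships" between the foot sensors and the HMD can be pictured as relative offset vectors from the HMD to each sensor. The sketch below is purely illustrative; the function name, the coordinate encoding, and the choice of offsets as the body information are assumptions, not details from the patent.

```python
# Hypothetical sketch: body information as per-sensor offsets from the HMD.
# Positions are (x, y, z) tuples in metres; all names are illustrative.

def relative_offsets(hmd_pos, sensor_positions):
    """Return one (dx, dy, dz) offset per foot sensor, relative to the HMD."""
    return [
        tuple(s - h for s, h in zip(sensor, hmd_pos))
        for sensor in sensor_positions
    ]

# HMD at head height; one sensor on each foot.
offsets = relative_offsets((0.0, 1.7, 0.0), [(0.1, 0.0, 0.0), (-0.1, 0.0, 0.1)])
```

The HMD could then drive an avatar's leg pose from these offsets as they change over time.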
  • Patent number: 10732725
    Abstract: A method of interactive display based on gesture recognition includes determining a plurality of gestures corresponding to a plurality of images, interpreting a predetermined combination of gestures among the plurality of gestures as a command, and displaying a scene in response to the command.
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: August 4, 2020
    Assignee: XRSpace CO., LTD.
    Inventors: Peter Chou, Feng-Seng Chu, Yen-Hung Lin, Shih-Hao Ke, Jui-Chieh Chen
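The claimed method maps a predetermined combination of recognized gestures to a command. A minimal way to express that mapping is a lookup table keyed by gesture sequences; the gesture labels and command names below are invented for illustration and do not come from the patent.

```python
# Illustrative mapping from a recognized gesture sequence to a display command.
# Gesture labels and command names are assumptions, not from the patent.

COMMANDS = {
    ("open", "fist"): "grab_scene",
    ("point", "swipe_left"): "previous_scene",
    ("point", "swipe_right"): "next_scene",
}

def interpret(gestures):
    """Interpret a predetermined combination of gestures as a command, if any."""
    return COMMANDS.get(tuple(gestures))
```

A renderer would then display the scene associated with the returned command.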
  • Publication number: 20200193667
    Abstract: An avatar facial expression generating system and a method of avatar facial expression generation are provided. In the method, multiple items of user data, related to the sensing result of a user, are obtained from multiple data sources. A first emotion decision is determined based on each item of user data. Whether an emotion collision occurs among the first emotion decisions is then determined; an emotion collision means that the emotion groups corresponding to the first emotion decisions do not match each other. A second emotion decision is determined from one or more emotion groups according to the result of the collision determination. Each first or second emotion decision is related to one emotion group. A facial expression of an avatar is generated based on the second emotion decision. Accordingly, a proper facial expression of the avatar can be presented.
    Type: Application
    Filed: February 27, 2020
    Publication date: June 18, 2020
    Applicant: XRSPACE CO., LTD.
    Inventors: Feng-Seng Chu, Peter Chou
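One plausible reading of the collision-handling step: each data source's emotion decision belongs to a coarser emotion group, and when the groups disagree, a second decision is chosen from the winning group. The grouping, the labels, and the majority-vote rule below are all assumptions made for illustration.

```python
# Illustrative "emotion collision" resolution. The emotion-to-group mapping
# and the majority-vote rule are assumptions, not taken from the patent.

EMOTION_GROUP = {
    "joy": "positive", "excitement": "positive",
    "anger": "negative", "sadness": "negative",
    "neutral": "neutral",
}

def resolve(first_decisions):
    """Return a single emotion decision from per-source first decisions."""
    groups = [EMOTION_GROUP[e] for e in first_decisions]
    if len(set(groups)) == 1:          # no collision: keep the shared decision
        return first_decisions[0]
    # collision: pick a decision from the most frequent group (majority vote)
    majority = max(set(groups), key=groups.count)
    return next(e for e, g in zip(first_decisions, groups) if g == majority)
```

The avatar's facial expression would then be generated from the resolved decision.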
  • Patent number: 10678342
    Abstract: A method of virtual user interface interaction based on gesture recognition comprises detecting two hands in a plurality of images, recognizing each hand's gesture, projecting a virtual user interface on an open gesture hand when one hand is recognized with a point gesture and the other hand is recognized with an open gesture, tracking an index fingertip of the point gesture hand, determining whether the index fingertip of the point gesture hand is close to the open gesture hand within a predefined rule, interpreting a movement of the index fingertip of the point gesture hand as a click command when the index fingertip of the point gesture hand is close to the open gesture hand within the predefined rule, and in response to the click command, generating image data with a character object of the virtual user interface.
    Type: Grant
    Filed: October 21, 2018
    Date of Patent: June 9, 2020
    Assignee: XRSpace CO., LTD.
    Inventors: Peter Chou, Feng-Seng Chu, Yen-Hung Lin, Shih-Hao Ke, Jui-Chieh Chen
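The "predefined rule" in this abstract reduces, in the simplest reading, to a distance threshold between the pointing fingertip and the open palm. The sketch below assumes a Euclidean-distance rule and an invented threshold value; neither detail is specified by the patent.

```python
# Illustrative proximity test for the claimed click command: a click registers
# when the index fingertip comes within a threshold distance of the open palm.
# The threshold value and the Euclidean rule are assumptions.

CLICK_DISTANCE = 0.03  # metres; stands in for the "predefined rule"

def is_click(fingertip, palm, threshold=CLICK_DISTANCE):
    """Return True when the fingertip is close enough to the palm to click."""
    dist = sum((a - b) ** 2 for a, b in zip(fingertip, palm)) ** 0.5
    return dist <= threshold
```

On a True result, the system would emit the image data for the selected character of the projected virtual keyboard.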
  • Patent number: 10650564
    Abstract: A method of generating 3D facial geometry for a computing device is disclosed. The method comprises obtaining a 2D image; performing a deep neural network (DNN) operation on the 2D image to classify each facial feature of the 2D image as one of a set of texture components and to obtain the probabilities that the facial feature belongs to each texture component, wherein the texture components are represented by a 3D face mesh and are predefined in the computing device; and generating a 3D facial model based on a 3D face template predefined in the computing device and the texture component with the highest probability.
    Type: Grant
    Filed: April 21, 2019
    Date of Patent: May 12, 2020
    Assignee: XRSpace CO., LTD.
    Inventors: Ting-Chieh Lin, Shih-Chieh Chou
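The selection step at the end of this method is an argmax over the DNN's per-component probabilities for each facial feature. The sketch below shows only that final step; the feature and component names are placeholders, and the DNN itself is out of scope.

```python
# Illustrative final step of the claimed method: for each facial feature,
# select the predefined texture component with the highest DNN probability.
# Feature and component names are placeholders.

def pick_components(feature_probs):
    """feature_probs: {feature: {texture_component: probability}} -> best per feature."""
    return {
        feature: max(probs, key=probs.get)
        for feature, probs in feature_probs.items()
    }

chosen = pick_components({
    "nose": {"narrow": 0.2, "wide": 0.8},
    "eyes": {"round": 0.6, "almond": 0.4},
})
```

The chosen components would then be applied to the predefined 3D face template to produce the facial model.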
  • Patent number: 10606345
    Abstract: A reality interactive responding system comprises a first server configured to receive first input data from a user and to determine whether the first input data conform to any one of a plurality of variation conditions; and a second server coupled to the first server and configured to receive second input data from the first server when the first input data conform to any one of the plurality of variation conditions, and to determine a plurality of interactions in response to the first input data from the user; wherein the first input data from the user are related to an action, a facial expression, a gaze, a text, a speech, a gesture, an emotion or a movement generated by the user.
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: March 31, 2020
    Assignee: XRSpace CO., LTD.
    Inventors: Peter Chou, Feng-Seng Chu, Cheng-Wei Lee, Yu-Chen Lai, Chuan-Chang Wang
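The two-server arrangement is essentially a filter-then-respond pipeline: the first stage forwards input only when a variation condition matches, and the second stage chooses the interactions. The conditions, interaction names, and in-process function calls below are stand-ins; the patent describes separate servers, not a single process.

```python
# Sketch of the claimed two-stage design as a filter-then-respond pipeline.
# Conditions, interaction labels, and the in-process calls are assumptions.

VARIATION_CONDITIONS = [
    lambda d: d.get("gesture") == "wave",
    lambda d: d.get("emotion") == "happy",
]

def first_server(data):
    """Forward the input only if it matches any variation condition."""
    return data if any(cond(data) for cond in VARIATION_CONDITIONS) else None

def second_server(data):
    """Determine the interactions in response to the forwarded input."""
    if data.get("gesture") == "wave":
        return ["avatar_waves_back"]
    return ["avatar_smiles"]

def respond(data):
    forwarded = first_server(data)
    return second_server(forwarded) if forwarded else []
```

Splitting the match test from the response keeps the (presumably heavier) interaction logic off the fast path for non-matching input.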
  • Publication number: 20200090394
    Abstract: An avatar facial expression generating system and a method of avatar facial expression generation are provided. In the method, user data related to the sensing result of a user is obtained. First and second emotional configurations are determined, which are maintained during a first and a second duration, respectively. A transition emotional configuration is determined based on the first and second emotional configurations and is maintained during a third duration, which lies between the first duration and the second duration. Facial expressions of an avatar are generated based on the first emotional configuration, the transition emotional configuration, and the second emotional configuration, respectively. Accordingly, a normal facial expression can be presented on the avatar during the emotion transformation.
    Type: Application
    Filed: October 17, 2019
    Publication date: March 19, 2020
    Applicant: XRSPACE CO., LTD.
    Inventors: Wei-Zhe Hong, Ming-Yang Kung, Ting-Chieh Lin, Feng-Seng Chu
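One plausible reading of the transition emotional configuration is a blend of the first and second configurations, displayed during the in-between third duration. The linear blend, the dict-of-weights encoding, and the 50/50 default below are assumptions for illustration.

```python
# Illustrative transition between two emotional configurations, encoded as
# dicts of expression weights. The linear blend is an assumption.

def transition_config(first, second, weight=0.5):
    """Blend two emotional configurations; weight=0 gives first, 1 gives second."""
    keys = set(first) | set(second)
    return {
        k: (1 - weight) * first.get(k, 0.0) + weight * second.get(k, 0.0)
        for k in keys
    }
```

Sweeping `weight` from 0 to 1 across the third duration would avoid the abrupt jump between the two emotions that the abstract is addressing.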
  • Publication number: 20200089940
    Abstract: A behavior understanding system and a behavior understanding method are provided. The behavior understanding system includes a sensor and a processor. The sensor senses a motion of a human body portion over a time period, and a sequence of motion sensing data from the sensor is obtained. At least two comparing results, corresponding to at least two timepoints within the time period, are generated by comparing the motion sensing data with base motion data, where the base motion data is related to multiple base motions. Behavior information of the human body portion is then determined according to the comparing results; the behavior information is related to a behavior formed by at least one of the base motions. Accordingly, the accuracy of behavior understanding can be improved, and the behavior can be predicted quickly.
    Type: Application
    Filed: September 10, 2019
    Publication date: March 19, 2020
    Applicant: XRSPACE CO., LTD.
    Inventors: Yi-Kang Hsieh, Ching-Ning Huang, Chien-Chih Hsu
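The comparison step can be pictured as nearest-neighbour matching: a motion sample at each timepoint is matched to its closest base motion, and the resulting sequence of base motions names the behavior. The base motions, behavior labels, and squared-distance metric below are invented for illustration.

```python
# Illustrative behavior understanding: match samples at two timepoints to the
# nearest base motion, then look up the behavior formed by that pair.
# Base motions, behaviors, and the distance metric are assumptions.

BASE_MOTIONS = {"raise": (0.0, 1.0), "lower": (0.0, -1.0), "hold": (0.0, 0.0)}
BEHAVIORS = {("raise", "lower"): "wave", ("raise", "hold"): "reach"}

def closest_base(sample):
    """Return the name of the base motion nearest to the motion sample."""
    return min(
        BASE_MOTIONS,
        key=lambda name: sum((a - b) ** 2 for a, b in zip(sample, BASE_MOTIONS[name])),
    )

def understand(samples):
    """Map motion samples at two timepoints to a behavior, if one is known."""
    return BEHAVIORS.get(tuple(closest_base(s) for s in samples))
```

Because each comparison uses only the sample at its own timepoint, a behavior can be predicted before the full motion sequence has finished.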
  • Patent number: D889464
    Type: Grant
    Filed: June 28, 2019
    Date of Patent: July 7, 2020
    Assignee: XRSPACE CO., LTD.
    Inventors: Chieh-Kai Wang, Yen-Hung Lin