Patents by Inventor William Burgar

William Burgar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). An illustrative sketch of the pipeline described in the shared abstract appears after the listing.

  • Publication number: 20230005384
    Abstract: Raw motion data is collected from sensors affixed to a user performing exercises or poses in response to perceiving content that is annotated with a motion track and presented to the user, the motion track having been generated using a posture dictionary. The exercises or poses are determined from the raw motion data using the posture dictionary. Motion data indicating the exercises or poses within the raw motion data is generated using the raw motion data. The user is graded on performing the exercises or poses by comparing the motion data with the motion track. The content is presented to the user based on the grading of the user.
    Type: Application
    Filed: April 12, 2022
    Publication date: January 5, 2023
    Inventors: Steven Webster, Ross Arnott, William Burgar
  • Patent number: 11302214
    Abstract: Raw motion data is collected from sensors affixed to a user performing exercises or poses in response to perceiving content that is annotated with a motion track and presented to the user, the motion track having been generated using a posture dictionary. The exercises or poses are determined from the raw motion data using the posture dictionary. Motion data indicating the exercises or poses within the raw motion data is generated using the raw motion data. The user is graded on performing the exercises or poses by comparing the motion data with the motion track. The content is presented to the user based on the grading of the user.
    Type: Grant
    Filed: July 23, 2019
    Date of Patent: April 12, 2022
    Assignee: Asensei, Inc.
    Inventors: Steven Webster, Ross Arnott, William Burgar
  • Publication number: 20190347957
    Abstract: Raw motion data is collected from sensors affixed to a user performing exercises or poses in response to perceiving content that is annotated with a motion track and presented to the user, the motion track having been generated using a posture dictionary. The exercises or poses are determined from the raw motion data using the posture dictionary. Motion data indicating the exercises or poses within the raw motion data is generated using the raw motion data. The user is graded on performing the exercises or poses by comparing the motion data with the motion track. The content is presented to the user based on the grading of the user.
    Type: Application
    Filed: July 23, 2019
    Publication date: November 14, 2019
    Applicant: ASENSEI, INC.
    Inventors: Steven Webster, Ross Arnott, William Burgar
  • Patent number: 10360811
    Abstract: Raw motion data is collected from sensors affixed to a user performing exercises or poses in response to perceiving content that is annotated with a motion track and presented to the user, the motion track having been generated using a posture dictionary. The exercises or poses are determined from the raw motion data using the posture dictionary. Motion data indicating the exercises or poses within the raw motion data is generated using the raw motion data. The user is graded on performing the exercises or poses by comparing the motion data with the motion track. The content is presented to the user based on the grading of the user.
    Type: Grant
    Filed: January 5, 2016
    Date of Patent: July 23, 2019
    Assignee: ASENSEI, INC.
    Inventors: Steven Webster, Ross Arnott, William Burgar
  • Publication number: 20160193500
    Abstract: Raw motion data is collected from sensors affixed to a user performing exercises or poses in response to perceiving content that is annotated with a motion track and presented to the user, the motion track having been generated using a posture dictionary. The exercises or poses are determined from the raw motion data using the posture dictionary. Motion data indicating the exercises or poses within the raw motion data is generated using the raw motion data. The user is graded on performing the exercises or poses by comparing the motion data with the motion track. The content is presented to the user based on the grading of the user.
    Type: Application
    Filed: January 5, 2016
    Publication date: July 7, 2016
    Applicant: ASENSEI, INC.
    Inventors: Steven Webster, Ross Arnott, William Burgar
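
All five filings above share one abstract describing a motion-coaching pipeline: raw sensor data is matched against a posture dictionary to detect exercises or poses, the detected poses are graded against the motion track annotated on the content, and the grade determines what content is presented next. The sketch below is illustrative only and is not taken from the patents; the data shapes, the posture dictionary entries, the nearest-neighbour matching, and all names are assumptions made for the example.

```python
"""Illustrative sketch (not from the patents) of the pipeline in the shared abstract:
raw sensor samples are matched against a posture dictionary, the detected poses are
graded against the content's motion track, and the grade drives content selection.
All names, data shapes, and the matching method are assumptions for illustration."""

from dataclasses import dataclass
from typing import Dict, List
import math


@dataclass
class PostureSample:
    """One raw sensor reading: joint-angle features captured at a timestamp."""
    timestamp: float
    features: List[float]


# Hypothetical posture dictionary: canonical feature vectors for named poses.
POSTURE_DICTIONARY: Dict[str, List[float]] = {
    "mountain_pose": [0.0, 0.0, 0.0],
    "forward_fold":  [1.2, 0.1, 0.0],
    "plank":         [0.0, 1.5, 0.2],
}


def classify_pose(sample: PostureSample) -> str:
    """Determine the pose for a raw sample via nearest neighbour in the dictionary."""
    def distance(a: List[float], b: List[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(POSTURE_DICTIONARY,
               key=lambda name: distance(sample.features, POSTURE_DICTIONARY[name]))


def grade_performance(raw_data: List[PostureSample], motion_track: List[str]) -> float:
    """Grade the user by comparing detected poses against the annotated motion track."""
    detected = [classify_pose(s) for s in raw_data]
    matches = sum(1 for got, expected in zip(detected, motion_track) if got == expected)
    return matches / len(motion_track) if motion_track else 0.0


def select_next_content(grade: float) -> str:
    """Present content based on the grade: repeat, continue, or advance."""
    if grade < 0.5:
        return "repeat_current_lesson"
    if grade < 0.9:
        return "continue_current_lesson"
    return "advance_to_next_lesson"


if __name__ == "__main__":
    # Simulated raw motion data for a short session whose motion track is given below.
    motion_track = ["mountain_pose", "forward_fold", "plank"]
    raw_data = [
        PostureSample(0.0, [0.1, 0.0, 0.0]),  # close to mountain_pose
        PostureSample(1.0, [1.1, 0.2, 0.0]),  # close to forward_fold
        PostureSample(2.0, [0.6, 0.3, 0.0]),  # off target; track expects plank here
    ]
    grade = grade_performance(raw_data, motion_track)
    print(f"grade = {grade:.2f}, next: {select_next_content(grade)}")
```

Running the sketch prints a grade of roughly 0.67 for the simulated session and suggests continuing the current lesson; a real system would work from continuous sensor streams and a much richer posture dictionary.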