Patents by Inventor Cecylia Wati
Cecylia Wati has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240135617
Abstract: Techniques for a motion-based online interactive platform are described. The platform makes a motion-based online class realistic and allows a teacher to visualize motions performed by a student from a chosen perspective, along with how closely each motion matches that of an authoritative instructor (model). Depending on the implementation, the platform may be realized as an application, a Teacher App or a Student App, each executed on a computer or control computer associated with an instructor or teacher, or on computing devices associated with students. Each computing device is coupled to or includes a camera, which a student uses to show their presence or the poses they perform. Data streams from the computing devices are received by the control computer, where each data stream includes a video and a set of sensing data. A 3D avatar of a student is generated from the sensing data in the control computer; the video is not used for generating the avatar.
Type: Application
Filed: June 18, 2023
Publication date: April 25, 2024
Inventors: Wade I. Lagrone, Edwin Angkasa, Bullit Sesariza, Indra Madyasiwi, Ali Alhabsyi, Cecylia Wati, Joseph Chamdani
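The abstract above emphasizes that each student's data stream carries both a video and sensing data, but only the sensing data feeds the 3D avatar. A minimal sketch of that separation is below; all names (`DataStream`, `build_avatar`, the joint fields) are illustrative assumptions, not from the patent.

```python
# Hedged sketch: each data stream carries a video plus sensing data,
# and the 3D avatar is built from the sensing data alone.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DataStream:
    student_id: str
    video_frames: List[bytes]                 # shown to the teacher; never used for the avatar
    sensing_data: List[Dict[str, float]]      # e.g. per-sample joint angles from wearable sensors


def build_avatar(stream: DataStream) -> List[Dict[str, float]]:
    """Derive a 3D avatar pose sequence from sensing data only.

    The video in the stream is deliberately ignored, mirroring the
    abstract's statement that it is not used for avatar generation.
    """
    return [dict(sample) for sample in stream.sensing_data]


stream = DataStream(
    student_id="s1",
    video_frames=[b"frame0"],
    sensing_data=[{"elbow": 90.0, "knee": 10.0}],
)
poses = build_avatar(stream)
```

The design point is that the avatar path depends only on the compact sensing data, so pose rendering does not require the bandwidth or processing of the video path.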
-
Patent number: 11682157
Abstract: Techniques for a motion-based online interactive platform are described. The platform allows a teacher to visualize motions performed by a student from a chosen perspective, along with how closely the motions match those of an authoritative instructor (model). Depending on the implementation, the platform may be realized as an application, a Teacher App or a Student App, each executed on a computer or control computer associated with an instructor or teacher, or on computing devices associated with students. Each computing device is coupled to or includes a camera, which a student uses to show their presence or the poses they perform. Data streams from the computing devices are received by the control computer.
Type: Grant
Filed: April 26, 2022
Date of Patent: June 20, 2023
Inventors: Wade I. Lagrone, Edwin Angkasa, Bullit Sesariza, Indra Madyasiwi, Ali Alhabsyi, Cecylia Wati, Joseph Chamdani
-
Publication number: 20220277506
Abstract: Techniques for a motion-based online interactive platform are described. The platform makes a motion-based online class realistic and allows a teacher to visualize motions performed by a student from a chosen perspective, along with how closely each motion matches that of an authoritative instructor (model). Depending on the implementation, the platform may be realized as an application, a Teacher App or a Student App, each executed on a computer or control computer associated with an instructor or teacher, or on computing devices associated with students. Each computing device is coupled to or includes a camera, which a student uses to show their presence or the poses they perform. Data streams from the computing devices are received by the control computer, where each data stream includes a video and a set of sensing data. A 3D avatar of a student is generated from the sensing data in the control computer; the video is not used for generating the avatar.
Type: Application
Filed: April 26, 2022
Publication date: September 1, 2022
Inventors: Wade I. Lagrone, Edwin Angkasa, Bullit Sesariza, Indra Madyasiwi, Ali Alhabsyi, Cecylia Wati, Joseph Chamdani
-
Patent number: 11238636
Abstract: Techniques for sport-specific training with captured body motions are described. A computing device receives sensing data wirelessly from a plurality of sensor modules respectively attached to different body parts of a user in accordance with a particular sport. The motion of the user is derived from the received sensing data. A corresponding 3D graphic environment pertaining to the sport is provided on a display to show the reference motion, performed by a reference chosen by the user, alongside the derived motion performed by the user. Differences between the reference motion and the derived motion are highlighted or animated in the 3D graphic environment.
Type: Grant
Filed: March 16, 2021
Date of Patent: February 1, 2022
Assignee: Turingsense Inc.
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Limin He
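The comparison step this abstract describes — finding where the derived motion diverges from the reference motion so the divergence can be highlighted — can be sketched as below. The per-joint angle representation and the 15° threshold are assumptions for illustration only.

```python
# Hedged sketch of highlighting differences between a reference motion
# and a user's derived motion, as the abstract describes.

def motion_differences(reference, derived, threshold_deg=15.0):
    """Return the joints whose angle differs from the reference by more
    than threshold_deg; a UI would highlight or animate these joints."""
    flagged = {}
    for joint, ref_angle in reference.items():
        diff = abs(derived.get(joint, 0.0) - ref_angle)
        if diff > threshold_deg:
            flagged[joint] = diff
    return flagged


ref = {"shoulder": 45.0, "elbow": 90.0, "wrist": 10.0}
usr = {"shoulder": 70.0, "elbow": 92.0, "wrist": 40.0}
# shoulder is off by 25 degrees and wrist by 30, so both are flagged;
# the elbow (2 degrees off) is within tolerance and is not.
flagged = motion_differences(ref, usr)
```

In a real pipeline this check would run per frame of the motion sequence; the sketch shows a single frame for clarity.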
-
Publication number: 20210201554
Abstract: Techniques for sport-specific training with captured body motions are described. A computing device receives sensing data wirelessly from a plurality of sensor modules respectively attached to different body parts of a user in accordance with a particular sport. The motion of the user is derived from the received sensing data. A corresponding 3D graphic environment pertaining to the sport is provided on a display to show the reference motion, performed by a reference chosen by the user, alongside the derived motion performed by the user. Differences between the reference motion and the derived motion are highlighted or animated in the 3D graphic environment.
Type: Application
Filed: March 16, 2021
Publication date: July 1, 2021
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Limin He
-
Patent number: 10950025
Abstract: Techniques for forming a specialized low-latency local area network with sensor modules are described. The sensor modules are respectively attached to certain parts of a human body. Sensing signals or data from the sensing modules are combined in a designated module, referred to as the hub module, which is responsible for communicating with an external device. The hub module establishes a communication session with each of the remaining sensor (satellite) modules, where each session may be established over at least one channel. The hub proactively switches to another channel whenever performance on the current channel degrades.
Type: Grant
Filed: March 23, 2020
Date of Patent: March 16, 2021
Assignee: Turingsense Inc.
Inventors: Joseph I. Chamdani, Cecylia Wati
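The proactive channel switching described above can be sketched as a hub that monitors per-channel performance and moves away from a degraded channel before sessions fail. The class name, latency metric, and 20 ms limit are hypothetical choices for illustration, not details from the patent.

```python
# Hedged sketch of a hub module's proactive channel switching: when the
# current channel's measured performance degrades past a limit, the hub
# moves its satellite sessions to the best remaining channel.

class HubModule:
    def __init__(self, channels):
        self.channels = list(channels)
        self.current = self.channels[0]

    def maybe_switch(self, latency_ms_by_channel, max_latency_ms=20.0):
        """Switch away from the current channel if its latency exceeds
        the limit; otherwise keep the current channel."""
        if latency_ms_by_channel[self.current] <= max_latency_ms:
            return self.current  # performance still acceptable
        # pick the channel with the lowest observed latency
        self.current = min(self.channels, key=lambda ch: latency_ms_by_channel[ch])
        return self.current


hub = HubModule(channels=[1, 6, 11])
ch = hub.maybe_switch({1: 35.0, 6: 8.0, 11: 12.0})  # channel 1 degraded
```

The switch is "proactive" in the sense that it is driven by a performance measurement rather than by an outright link failure.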
-
Publication number: 20200265628
Abstract: Techniques for forming a specialized low-latency local area network with sensor modules are described. The sensor modules are respectively attached to certain parts of a human body. Sensing signals or data from the sensing modules are combined in a designated module, referred to as the hub module, which is responsible for communicating with an external device. The hub module establishes a communication session with each of the remaining sensor (satellite) modules, where each session may be established over at least one channel. The hub proactively switches to another channel whenever performance on the current channel degrades.
Type: Application
Filed: March 23, 2020
Publication date: August 20, 2020
Inventors: Joseph I. Chamdani, Cecylia Wati
-
Patent number: 10672173
Abstract: Techniques for storing attributes of motion and sharing the motion are described. The motion of a first user is captured and analyzed, and the attributes of the motion are stored on a server or cloud. The attributes are represented in a 3D anatomical coordinate system to ensure a reliable representation of the anatomy behind the motion. When accessed by a second user, an avatar is animated per the stored attributes of motion while similar motion made by the second user is captured. A stream showing the differences between the motion of the second user and the avatar is provided to a device associated with the second user.
Type: Grant
Filed: May 27, 2019
Date of Patent: June 2, 2020
Assignee: Turingsense Inc.
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He
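The point of the anatomical coordinate system in this abstract is that stored motion attributes should not depend on incidental capture conditions such as which way the subject was facing. A minimal sketch of that idea is below, using plain 2D vectors and an assumed forward axis instead of the full 3D anatomical frame the patent would define.

```python
# Hedged sketch: express a world-frame direction relative to the subject's
# own (anatomical) frame so stored motion attributes are comparable across
# users and capture sessions. 2D, unit vectors assumed, for illustration.

def to_anatomical(world_vec, body_forward):
    """Rotate a world-frame vector into a frame whose +x axis is the
    subject's forward direction, removing dependence on heading."""
    fx, fy = body_forward
    wx, wy = world_vec
    # 2D rotation aligning body_forward with the +x axis
    return (wx * fx + wy * fy, -wx * fy + wy * fx)


# A subject facing +y moving an arm along +y maps to the same anatomical
# direction as a subject facing +x moving an arm along +x.
a = to_anatomical((0.0, 1.0), body_forward=(0.0, 1.0))
b = to_anatomical((1.0, 0.0), body_forward=(1.0, 0.0))
```

In both cases the motion comes out as "straight ahead" in the anatomical frame, which is what makes the stored attributes reusable for the second user's comparison.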
-
Publication number: 20190295307
Abstract: Techniques for storing attributes of motion and sharing the motion are described. The motion of a first user is captured and analyzed, and the attributes of the motion are stored on a server or cloud. The attributes are represented in a 3D anatomical coordinate system to ensure a reliable representation of the anatomy behind the motion. When accessed by a second user, an avatar is animated per the stored attributes of motion while similar motion made by the second user is captured. A stream showing the differences between the motion of the second user and the avatar is provided to a device associated with the second user.
Type: Application
Filed: May 27, 2019
Publication date: September 26, 2019
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He
-
Patent number: 10304230
Abstract: Techniques for capturing and analyzing motion made by a person performing activities are described. According to one aspect of the present invention, sensing devices are attached to different parts of a body. As the person moves, the sensor modules, each including at least one inertial sensor, produce sensing data that are locally received in one designated sensing device, which is in communication with an external device either remotely or locally. The combined sensing data received from these sensing devices are processed and analyzed to derive the motions made by the person. Depending on the application, various attributes of the motion can be derived from the combined sensing data, and the attributes can be incorporated into an application running on a mobile device for 3D graphics rendering as a human avatar animation.
Type: Grant
Filed: December 13, 2018
Date of Patent: May 28, 2019
Assignee: Turingsense Inc.
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He
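The collection step this family of abstracts describes — many body-worn modules reporting to one designated device, whose combined data is then analyzed — can be sketched as a merge of per-module samples into a single timeline. All field and function names here are illustrative.

```python
# Hedged sketch of the described collection step: sensor modules attached
# to different body parts send samples to one designated module, which
# combines them into per-timestamp records before analysis.

def combine_sensing_data(per_module_samples):
    """Merge {module_name: [(timestamp, reading), ...]} into one timeline
    {timestamp: {module_name: reading}} so a full-body pose can be derived
    for each instant."""
    timeline = {}
    for module, samples in per_module_samples.items():
        for t, reading in samples:
            timeline.setdefault(t, {})[module] = reading
    return timeline


combined = combine_sensing_data({
    "wrist": [(0, 0.1), (1, 0.2)],
    "ankle": [(0, 0.9), (1, 0.8)],
})
```

Grouping by timestamp is what lets a later stage treat each instant as one full-body observation rather than as unrelated per-sensor streams.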
-
Publication number: 20190122410
Abstract: Techniques for capturing and analyzing motion made by a person performing activities are described. According to one aspect of the present invention, sensing devices are attached to different parts of a body. As the person moves, the sensor modules, each including at least one inertial sensor, produce sensing data that are locally received in one designated sensing device, which is in communication with an external device either remotely or locally. The combined sensing data received from these sensing devices are processed and analyzed to derive the motions made by the person. Depending on the application, various attributes of the motion can be derived from the combined sensing data, and the attributes can be incorporated into an application running on a mobile device for 3D graphics rendering as a human avatar animation.
Type: Application
Filed: December 13, 2018
Publication date: April 25, 2019
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He
-
Patent number: 10157488
Abstract: Techniques for capturing and analyzing motion made by a person performing activities are described. According to one aspect of the present invention, sensing devices are attached to different parts of a body. As the person moves, the sensor modules, each including at least one inertial sensor, produce sensing data that are locally received in one designated sensing device, which is in communication with an external device either remotely or locally. Relying on the resources of the external device, the combined sensing data received from these sensing devices are processed and analyzed to derive the motions made by the person. Depending on the application, various attributes of the motion can be derived from the combined sensing data, and the attributes can be incorporated into an application running on a mobile device for 3D graphics rendering as a human avatar animation and for motion chart analysis.
Type: Grant
Filed: September 20, 2016
Date of Patent: December 18, 2018
Assignee: Turingsense Inc.
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He
-
Publication number: 20170084070
Abstract: Techniques for capturing and analyzing motion made by a person performing activities are described. According to one aspect of the present invention, sensing devices are attached to different parts of a body. As the person moves, the sensor modules, each including at least one inertial sensor, produce sensing data that are locally received in one designated sensing device, which is in communication with an external device either remotely or locally. Relying on the resources of the external device, the combined sensing data received from these sensing devices are processed and analyzed to derive the motions made by the person. Depending on the application, various attributes of the motion can be derived from the combined sensing data, and the attributes can be incorporated into an application running on a mobile device for 3D graphics rendering as a human avatar animation and for motion chart analysis.
Type: Application
Filed: September 20, 2016
Publication date: March 23, 2017
Inventors: Joseph I. Chamdani, Pietro Garofalo, Cecylia Wati, Jasmin Nakic, Taufik Arifin, Limin He