Patents by Inventor David Scott Allmon
David Scott Allmon has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11151857
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Grant
Filed: March 23, 2020
Date of Patent: October 19, 2021
Assignee: Google LLC
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
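The abstract above describes a pipeline from garment sensor data to a per-segment ergonomic zone. A minimal illustrative sketch follows; the zone names, angle thresholds, and the averaging step are assumptions for illustration only and are not taken from the patent.

```python
# Hypothetical sketch of the pipeline in the abstract:
# sensor readings -> bend angle per body segment -> ergonomic zone.
# Thresholds and zone names are illustrative, not from the patent.

def bend_angle(sensor_readings):
    """Reduce raw readings for one body segment to a bend angle (degrees).

    Placeholder: a simple average; a real garment would fuse sensor data.
    """
    return sum(sensor_readings) / len(sensor_readings)

def ergonomic_zone(angle):
    """Classify a bend angle into an ergonomic zone (illustrative cutoffs)."""
    if angle < 20:
        return "safe"
    elif angle < 45:
        return "caution"
    return "hazardous"

def assess(segments):
    """Map {segment_name: [readings]} to {segment_name: zone}."""
    return {name: ergonomic_zone(bend_angle(readings))
            for name, readings in segments.items()}

print(assess({"lower_back": [10, 15, 14], "neck": [50, 55, 52]}))
# -> {'lower_back': 'safe', 'neck': 'hazardous'}
```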
-
Patent number: 11103015
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Grant
Filed: April 8, 2020
Date of Patent: August 31, 2021
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
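The gesture manager described above maps recognized fabric gestures to actions. A minimal dispatch sketch follows; the class name, gesture names, and actions are hypothetical illustrations, not the patent's implementation.

```python
# Hypothetical sketch of the gesture-manager dispatch in the abstract:
# a recognized gesture on the interactive fabric triggers registered
# functionality. Gesture and action names are illustrative only.

class GestureManager:
    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        """Register a callback for a recognized gesture (e.g. 'double_tap')."""
        self._handlers[gesture] = handler

    def recognize(self, gesture):
        """Dispatch a recognized gesture to its handler; None if unregistered."""
        handler = self._handlers.get(gesture)
        return handler() if handler else None

manager = GestureManager()
manager.on("double_tap", lambda: "answer phone call")
manager.on("swipe_up", lambda: "send text message")

print(manager.recognize("double_tap"))  # -> answer phone call
```

The registry-of-callbacks shape fits the abstract's point that the manager may live on the garment, a paired device, or a cloud service: only the `recognize` entry point needs to be reachable from the fabric's touch sensing.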
-
Publication number: 20200258366
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Application
Filed: March 23, 2020
Publication date: August 13, 2020
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
-
Publication number: 20200229515
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Application
Filed: April 8, 2020
Publication date: July 23, 2020
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Patent number: 10660379
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Grant
Filed: March 18, 2019
Date of Patent: May 26, 2020
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Patent number: 10600304
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Grant
Filed: June 19, 2019
Date of Patent: March 24, 2020
Assignee: Google LLC
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
-
Publication number: 20190333355
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Application
Filed: June 19, 2019
Publication date: October 31, 2019
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
-
Patent number: 10366593
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Grant
Filed: February 8, 2017
Date of Patent: July 30, 2019
Assignee: Google LLC
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
-
Publication number: 20190208837
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Application
Filed: March 18, 2019
Publication date: July 11, 2019
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Patent number: 10285456
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Grant
Filed: May 15, 2017
Date of Patent: May 14, 2019
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Publication number: 20190051133
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Application
Filed: February 2, 2017
Publication date: February 14, 2019
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
-
Publication number: 20170325518
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Application
Filed: May 15, 2017
Publication date: November 16, 2017
Applicant: Google Inc.
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong