Patents by Inventor Ivan Poupyrev
Ivan Poupyrev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10459080
Abstract: This document describes techniques and devices for radar-based object detection for vehicles. A radar-based object detection component implemented in a vehicle is configured to detect characteristics of persons within the vehicle, such as a driver or other passengers. Based on the detected characteristics, an activity of the person can be determined and various operations can be initiated based on the activity, such as initiating a warning when the driver is not paying attention to driving or automatically slowing down the vehicle. In some cases, the radar-based object detection component can also be implemented to detect characteristics of objects positioned external to the vehicle, such as pedestrians, other vehicles, or objects in the road. The radar-based object detection component may also be implemented to authenticate a driver of the vehicle, such as by detecting biometric characteristics of the driver or recognizing a series of gestures corresponding to an authentication sequence.
Type: Grant
Filed: October 6, 2016
Date of Patent: October 29, 2019
Assignee: Google LLC
Inventors: Carsten C. Schwesig, Ivan Poupyrev
-
Publication number: 20190321719
Abstract: This document describes apparatuses and techniques for radar-enabled sensor fusion. In some aspects, a radar field is provided and reflection signals that correspond to a target in the radar field are received. The reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted. Based on the radar features, a sensor is activated to provide supplemental sensor data associated with the physical characteristic. The radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature. By so doing, performance of sensor-based applications, which rely on the enhanced radar features, can be improved.
Type: Application
Filed: July 3, 2019
Publication date: October 24, 2019
Applicant: Google LLC
Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
-
Patent number: 10409385
Abstract: This document describes techniques and devices for occluded gesture recognition. Through use of the techniques and devices described herein, users may control their devices even when a user's gesture is occluded by some material between the user's hands and the device itself. Thus, the techniques enable users to control their mobile devices in many situations in which control is desired but conventional techniques do not permit effective control, such as when a user's mobile computing device is occluded by being in a purse, bag, pocket, or even in another room.
Type: Grant
Filed: September 13, 2017
Date of Patent: September 10, 2019
Assignee: Google LLC
Inventor: Ivan Poupyrev
-
Patent number: 10401490
Abstract: This document describes apparatuses and techniques for radar-enabled sensor fusion. In some aspects, a radar field is provided and reflection signals that correspond to a target in the radar field are received. The reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted. Based on the radar features, a sensor is activated to provide supplemental sensor data associated with the physical characteristic. The radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature. By so doing, performance of sensor-based applications, which rely on the enhanced radar features, can be improved.
Type: Grant
Filed: October 5, 2016
Date of Patent: September 3, 2019
Assignee: Google LLC
Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
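The sensor-fusion pipeline the abstract above describes (extract a radar feature, activate a supplemental sensor, then augment the feature) can be sketched in a few lines. This is an illustrative sketch only: the `RadarFeature` type, the confidence-weighted averaging, and all values are assumptions, since the abstract does not specify a fusion algorithm.

```python
from dataclasses import dataclass

@dataclass
class RadarFeature:
    """A physical characteristic extracted from radar reflection data."""
    name: str
    value: float
    confidence: float  # 0.0-1.0

def fuse(radar: RadarFeature, supplemental_value: float,
         supplemental_confidence: float) -> RadarFeature:
    """Augment a radar feature with supplemental sensor data.

    A confidence-weighted average is one simple fusion strategy;
    the patent leaves the actual algorithm open.
    """
    total = radar.confidence + supplemental_confidence
    fused_value = (radar.value * radar.confidence +
                   supplemental_value * supplemental_confidence) / total
    return RadarFeature(radar.name, fused_value, min(1.0, total))

# Radar alone estimates a target's surface velocity with low confidence;
# an activated accelerometer refines the estimate.
radar_feature = RadarFeature("surface_velocity", 0.30, 0.4)
fused = fuse(radar_feature, 0.25, 0.5)
print(round(fused.value, 3))  # 0.272
```

The key idea is that the supplemental sensor is only powered on after the radar feature indicates it is needed, which is why the abstract frames this as improving sensor-based applications rather than running every sensor continuously.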
-
Publication number: 20190257939
Abstract: A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by application of a computing device based on detected gestures. In one example, an input is detected using a three dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation is then recognized that corresponds to the gesture by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
Type: Application
Filed: May 2, 2019
Publication date: August 22, 2019
Applicant: Google LLC
Inventors: Carsten C. Schwesig, Ivan Poupyrev
-
Patent number: 10379621
Abstract: A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by application of a computing device based on detected gestures. In one example, an input is detected using a three dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation is then recognized that corresponds to the gesture by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
Type: Grant
Filed: October 5, 2016
Date of Patent: August 13, 2019
Assignee: Google LLC
Inventors: Carsten C. Schwesig, Ivan Poupyrev
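The gesture-component flow in the abstract above (detect input, recognize it against a gesture library, look up the corresponding operation, expose it to applications) can be sketched as follows. All names, the library contents, and the exact-match recognition are illustrative assumptions; the patent does not publish an API.

```python
from typing import Optional

# Hypothetical gesture library mapping recognized gestures to operations.
GESTURE_LIBRARY = {
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
    "pinch": "volume_down",
}

def recognize(detected_input: str) -> Optional[str]:
    """Compare detected input against the gesture library."""
    return detected_input if detected_input in GESTURE_LIBRARY else None

def operation_for(gesture: str) -> str:
    """Look up the operation that corresponds to a recognized gesture."""
    return GESTURE_LIBRARY[gesture]

# An application receives the operation through the component's API
# rather than interpreting raw detection data itself.
gesture = recognize("swipe_right")
if gesture is not None:
    print(operation_for(gesture))  # next_track
```

Centralizing the library in the gesture component, as the abstract describes, means individual applications never handle raw three-dimensional detection data; they consume named operations through the API.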
-
Publication number: 20190243464
Abstract: This document describes techniques and devices for type-agnostic radio frequency (RF) signal representations. These techniques and devices enable use of multiple different types of radar systems and fields through type-agnostic RF signal representations. By so doing, recognition and application-layer analysis can be independent of various radar parameters that differ between different radar systems and fields.
Type: Application
Filed: April 19, 2019
Publication date: August 8, 2019
Applicants: Google LLC, The Board of Trustees of the Leland Stanford Junior University
Inventors: Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
-
Publication number: 20190232156
Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
Type: Application
Filed: April 10, 2019
Publication date: August 1, 2019
Applicant: Google LLC
Inventors: Patrick M. Amihood, Ivan Poupyrev
-
Patent number: 10366593
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
Type: Grant
Filed: February 8, 2017
Date of Patent: July 30, 2019
Assignee: Google LLC
Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
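The core mapping the abstract above describes, from a body-segment bend angle to an ergonomic zone, can be sketched as a simple threshold function. The zone names and angle boundaries here are illustrative assumptions; the patent does not publish specific thresholds.

```python
def ergonomic_zone(bend_angle_degrees: float) -> str:
    """Map a body-segment bend angle to an ergonomic zone.

    Thresholds are hypothetical; a real assessment would be
    calibrated per body segment and posture duration.
    """
    if bend_angle_degrees < 20:
        return "safe"
    if bend_angle_degrees < 45:
        return "caution"
    return "hazard"

# Garment sensor data indicates a 30-degree trunk flexion.
print(ergonomic_zone(30.0))  # caution
```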
-
Publication number: 20190208837
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Application
Filed: March 18, 2019
Publication date: July 11, 2019
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Publication number: 20190187265
Abstract: Techniques and devices for seamless authentication using radar are described. In some implementations, a radar field is provided through a radar-based authentication system. The radar-based authentication system can sense reflections from an object in the radar field and analyze the reflections to determine whether the object is a person. In response to determining that the object is a person, the radar-based authentication system can sense an identifying characteristic associated with the person. Based on the identifying characteristic, the radar-based authentication system can determine that the person is an authorized user.
Type: Application
Filed: December 15, 2017
Publication date: June 20, 2019
Applicant: Google LLC
Inventors: Brandon Barbello, Leonardo Giusti, Ivan Poupyrev, Eiji Hayashi
-
Patent number: 10310620
Abstract: This document describes techniques and devices for type-agnostic radio frequency (RF) signal representations. These techniques and devices enable use of multiple different types of radar systems and fields through type-agnostic RF signal representations. By so doing, recognition and application-layer analysis can be independent of various radar parameters that differ between different radar systems and fields.
Type: Grant
Filed: April 29, 2016
Date of Patent: June 4, 2019
Assignees: Google LLC, The Board of Trustees of the Leland Stanford Junior University
Inventors: Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
-
Patent number: 10300370
Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
Type: Grant
Filed: October 5, 2016
Date of Patent: May 28, 2019
Assignee: Google LLC
Inventors: Patrick M. Amihood, Ivan Poupyrev
-
Publication number: 20190155396
Abstract: This document describes techniques for radio frequency (RF) based micro-motion tracking. These techniques enable even millimeter-scale hand motions to be tracked. To do so, radar signals are used from radar systems that, with conventional techniques, would only permit resolutions of a centimeter or more.
Type: Application
Filed: January 18, 2019
Publication date: May 23, 2019
Applicants: Google LLC, The Board of Trustees of the Leland Stanford Junior University
Inventors: Jaime Lien, Erik M. Olson, Patrick M. Amihood, Ivan Poupyrev
-
Patent number: 10285456
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
Type: Grant
Filed: May 15, 2017
Date of Patent: May 14, 2019
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
-
Publication number: 20190138109
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Application
Filed: January 2, 2019
Publication date: May 9, 2019
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Patent number: 10268321
Abstract: This document describes interactive textiles within hard objects. An interactive textile includes a grid of conductive thread woven into the interactive textile to form a capacitive touch sensor that is configured to detect touch-input. The interactive textile can process the touch-input to generate touch data that is useable to control various remote devices. For example, the interactive textiles may aid users in controlling volume on a stereo, pausing a movie playing on a television, or selecting a webpage on a desktop computer. Due to the flexibility of textiles, the interactive textile may be easily integrated within flexible objects, such as clothing, handbags, fabric casings, hats, and so forth. In one or more implementations, the interactive textiles may be integrated within various hard objects, such as by injection molding the interactive textile into a plastic cup, a hard casing of a smart phone, and so forth.
Type: Grant
Filed: October 1, 2014
Date of Patent: April 23, 2019
Assignee: Google LLC
Inventor: Ivan Poupyrev
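Locating touch input on a grid of conductive threads, as the capacitive sensor in the abstract above does, can be sketched as a threshold scan over per-crossing capacitance readings. The grid size, the normalized readings, and the threshold value are illustrative assumptions, not from the patent.

```python
THRESHOLD = 0.6  # hypothetical normalized capacitance change indicating a touch

def touch_points(grid):
    """Return (row, col) thread crossings whose reading exceeds THRESHOLD."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, value in enumerate(row)
            if value > THRESHOLD]

readings = [
    [0.1, 0.2, 0.1],
    [0.1, 0.9, 0.2],  # finger over the center thread crossing
    [0.0, 0.1, 0.1],
]
print(touch_points(readings))  # [(1, 1)]
```

Because the sensor reports discrete crossings rather than an image, even this simple scan is enough to derive touch data (taps, or swipes from successive frames) that a connected device can act on.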
-
Patent number: 10241581
Abstract: This document describes techniques for radio frequency (RF) based micro-motion tracking. These techniques enable even millimeter-scale hand motions to be tracked. To do so, radar signals are used from radar systems that, with conventional techniques, would only permit resolutions of a centimeter or more.
Type: Grant
Filed: April 29, 2016
Date of Patent: March 26, 2019
Assignees: Google LLC, The Board of Trustees of the Leland Stanford Junior University
Inventors: Jaime Lien, Erik M. Olson, Patrick M. Amihood, Ivan Poupyrev
-
Patent number: 10222469
Abstract: This document describes apparatuses and techniques for radar-based contextual sensing. In some aspects, a radar sensor of a device is activated to obtain radar data for a space of interest. Three-dimensional (3D) radar features are extracted from the radar data and positional data is received from sensors. Based on the positional data, spatial relation of the 3D radar features is determined to generate a set of 3D landmarks for the space. This set of 3D landmarks is then compared with known 3D context models to identify a 3D context model that matches the 3D landmarks. Based on a matching 3D context model, a context for the space is retrieved and used to configure contextual settings of the device. By so doing, contextual settings of the device can be dynamically configured to address changes in context or for different device environments.
Type: Grant
Filed: October 6, 2016
Date of Patent: March 5, 2019
Assignee: Google LLC
Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
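The matching step the abstract above describes, comparing a set of 3D landmarks against known 3D context models, can be sketched with a simple set-similarity criterion. The landmark labels, the model library, and the use of Jaccard similarity are all illustrative assumptions; the patent leaves the comparison method open.

```python
def best_context(landmarks: set, models: dict) -> str:
    """Pick the context model whose landmark set best matches the space.

    Jaccard similarity over landmark labels is one simple, hypothetical
    matching criterion for this sketch.
    """
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b)
    return max(models, key=lambda name: jaccard(landmarks, models[name]))

# Hypothetical library of known 3D context models.
MODELS = {
    "living_room": {"sofa", "tv", "coffee_table"},
    "office": {"desk", "monitor", "chair"},
}

observed = {"desk", "chair", "lamp"}
print(best_context(observed, MODELS))  # office
```

Once a model matches, the device retrieves the context associated with it (e.g., "office") and configures its contextual settings accordingly, re-running the match as the radar data changes.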
-
Publication number: 20190056276
Abstract: This document describes techniques using, and devices embodying, radar-based force sensing. These techniques and devices can enable a great breadth of forces to be measured. Furthermore, radar-based force sensing enables those forces to use, control, and interact with devices.
Type: Application
Filed: August 18, 2017
Publication date: February 21, 2019
Applicant: Google LLC
Inventor: Ivan Poupyrev