Patents by Inventor Ivan Poupyrev

Ivan Poupyrev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10310620
    Abstract: This document describes techniques and devices for type-agnostic radio frequency (RF) signal representations. These techniques and devices enable use of multiple different types of radar systems and fields through type-agnostic RF signal representations. By so doing, recognition and application-layer analysis can be independent of various radar parameters that differ between different radar systems and fields.
    Type: Grant
    Filed: April 29, 2016
    Date of Patent: June 4, 2019
    Assignees: Google LLC, The Board of Trustees of the Leland Stanford Junior University
    Inventors: Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
  • Patent number: 10300370
    Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
    Type: Grant
    Filed: October 5, 2016
    Date of Patent: May 28, 2019
    Assignee: Google LLC
    Inventors: Patrick M. Amihood, Ivan Poupyrev
  • Publication number: 20190155396
    Abstract: This document describes techniques for radio frequency (RF) based micro-motion tracking. These techniques enable even millimeter-scale hand motions to be tracked. To do so, radar signals are used from radar systems that, with conventional techniques, would only permit resolutions of a centimeter or more.
    Type: Application
    Filed: January 18, 2019
    Publication date: May 23, 2019
    Applicants: Google LLC, The Board of Trustees of the Leland Stanford Junior University
    Inventors: Jaime Lien, Erik M. Olson, Patrick M. Amihood, Ivan Poupyrev
  • Patent number: 10285456
    Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud-based service. Generally, the gesture manager recognizes user interactions with the interactive fabric, and in response, triggers various types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
    Type: Grant
    Filed: May 15, 2017
    Date of Patent: May 14, 2019
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
  • Publication number: 20190138109
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Application
    Filed: January 2, 2019
    Publication date: May 9, 2019
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Patent number: 10268321
    Abstract: This document describes interactive textiles within hard objects. An interactive textile includes a grid of conductive thread woven into the interactive textile to form a capacitive touch sensor that is configured to detect touch-input. The interactive textile can process the touch-input to generate touch data that is useable to control various remote devices. For example, the interactive textiles may aid users in controlling volume on a stereo, pausing a movie playing on a television, or selecting a webpage on a desktop computer. Due to the flexibility of textiles, the interactive textile may be easily integrated within flexible objects, such as clothing, handbags, fabric casings, hats, and so forth. In one or more implementations, the interactive textiles may be integrated within various hard objects, such as by injection molding the interactive textile into a plastic cup, a hard casing of a smart phone, and so forth.
    Type: Grant
    Filed: October 1, 2014
    Date of Patent: April 23, 2019
    Assignee: Google LLC
    Inventor: Ivan Poupyrev
  • Patent number: 10241581
    Abstract: This document describes techniques for radio frequency (RF) based micro-motion tracking. These techniques enable even millimeter-scale hand motions to be tracked. To do so, radar signals are used from radar systems that, with conventional techniques, would only permit resolutions of a centimeter or more.
    Type: Grant
    Filed: April 29, 2016
    Date of Patent: March 26, 2019
    Assignees: Google LLC, The Board of Trustees of the Leland Stanford Junior University
    Inventors: Jaime Lien, Erik M. Olson, Patrick M. Amihood, Ivan Poupyrev
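The micro-motion tracking abstracts above rely on a well-known radar property: a target's radial displacement shifts the phase of the returned signal far more finely than the range-bin resolution. A minimal sketch of phase-based displacement estimation, assuming a hypothetical 60 GHz carrier (the abstract names no frequency and no specific algorithm):

```python
import numpy as np

C = 3e8          # speed of light, m/s
FREQ = 60e9      # assumed carrier frequency; not specified in the abstract
WAVELENGTH = C / FREQ  # 5 mm at 60 GHz

def displacement_from_phase(phase_prev, phase_now):
    """Estimate radial displacement from the change in echo phase.

    A two-way path change of one wavelength shifts the echo phase by
    4*pi, so sub-wavelength (millimeter-scale) motion is resolvable
    even when the range-bin resolution is only a centimeter or more.
    """
    # Wrap the phase difference to [-pi, pi] before converting to distance.
    dphi = np.angle(np.exp(1j * (phase_now - phase_prev)))
    return dphi * WAVELENGTH / (4 * np.pi)

# A 1 mm target motion at 60 GHz produces ~2.5 rad of phase change.
true_motion = 1e-3
phase_shift = 4 * np.pi * true_motion / WAVELENGTH
est = displacement_from_phase(0.0, phase_shift)
```

At 60 GHz the wavelength is 5 mm, so a 1 mm motion shifts the two-way phase by roughly 2.5 rad, which is comfortably measurable even when the coarse range resolution is centimeters.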
  • Patent number: 10222469
    Abstract: This document describes apparatuses and techniques for radar-based contextual sensing. In some aspects, a radar sensor of a device is activated to obtain radar data for a space of interest. Three-dimensional (3D) radar features are extracted from the radar data and positional data is received from sensors. Based on the positional data, spatial relation of the 3D radar features is determined to generate a set of 3D landmarks for the space. This set of 3D landmarks is then compared with known 3D context models to identify a 3D context model that matches the 3D landmarks. Based on a matching 3D context model, a context for the space is retrieved and used to configure contextual settings of the device. By so doing, contextual settings of the device can be dynamically configured to address changes in context or for different device environments.
    Type: Grant
    Filed: October 6, 2016
    Date of Patent: March 5, 2019
    Assignee: Google LLC
    Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
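The matching step in the abstract above — comparing a set of 3D landmarks against known 3D context models — can be sketched as a simple best-overlap search. The landmark representation, the context names, and the overlap score below are illustrative assumptions, not the patented method:

```python
def best_context(landmarks, context_models):
    """Return the context whose model shares the most landmarks
    with the set observed in the space of interest."""
    def score(model):
        # Overlap score: count of landmarks common to both sets.
        return len(set(landmarks) & set(model["landmarks"]))
    return max(context_models, key=score)["context"]

# Hypothetical context models; real models would encode 3D geometry.
models = [
    {"context": "meeting_room", "landmarks": {"table", "chairs", "screen"}},
    {"context": "car", "landmarks": {"seat", "wheel", "dashboard"}},
]

# Two of three observed landmarks match the meeting-room model.
ctx = best_context({"table", "screen", "lamp"}, models)  # "meeting_room"
```

The matched context would then drive the device's contextual settings, as the abstract describes.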
  • Publication number: 20190056276
    Abstract: This document describes techniques using, and devices embodying, radar-based force sensing. These techniques and devices can enable a great breadth of forces to be measured. Furthermore, radar-based force sensing enables those forces to be used to control and interact with devices.
    Type: Application
    Filed: August 18, 2017
    Publication date: February 21, 2019
    Applicant: Google LLC
    Inventor: Ivan Poupyrev
  • Publication number: 20190051133
    Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
    Type: Application
    Filed: February 2, 2017
    Publication date: February 14, 2019
    Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
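The publication above maps a body segment's bend angle to ergonomic zones. A toy sketch of that classification; the zone names and angle thresholds are invented for illustration, as the publication does not specify them:

```python
def ergonomic_zone(bend_angle_deg):
    """Classify a body-segment bend angle into an ergonomic zone.

    Thresholds are hypothetical: a real assessment would derive them
    from the sensed corporeal data for the specific body segment.
    """
    if bend_angle_deg < 20:
        return "green"   # neutral posture
    elif bend_angle_deg < 60:
        return "yellow"  # caution: sustained load may cause strain
    return "red"         # high-risk posture

# Example: a 45-degree trunk bend falls in the caution zone.
zone = ergonomic_zone(45)  # "yellow"
```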
  • Patent number: 10203763
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: October 5, 2015
    Date of Patent: February 12, 2019
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Publication number: 20190033981
    Abstract: This document describes techniques using, and devices embodying, wide-field radar-based gesture recognition. These techniques and devices can enable a great breadth of gestures and uses for those gestures, such as gestures to use, control, and interact with computing and non-computing devices, from software applications to refrigerators.
    Type: Application
    Filed: October 5, 2018
    Publication date: January 31, 2019
    Applicant: Google LLC
    Inventor: Ivan Poupyrev
  • Publication number: 20190011989
    Abstract: A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by applications of a computing device based on detected gestures. In one example, an input is detected using a three-dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation that corresponds to the gesture is then identified by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
    Type: Application
    Filed: October 5, 2016
    Publication date: January 10, 2019
    Applicant: Google Inc.
    Inventors: Carsten C. Schwesig, Ivan Poupyrev
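The gesture-library pattern in the abstract above — recognize an input against a library, then expose the corresponding operation to applications — can be sketched in a few lines. All gesture and operation names here are hypothetical:

```python
class GestureComponent:
    """Minimal sketch of a gesture component maintaining a gesture library."""

    def __init__(self):
        # The library maps each recognized gesture to an operation name
        # that an application can execute via the component's API.
        self.library = {
            "swipe_left": "previous_track",
            "swipe_right": "next_track",
            "pinch": "volume_down",
        }

    def recognize(self, detected_input):
        """Compare detected input against the gesture library."""
        return detected_input if detected_input in self.library else None

    def operation_for(self, gesture):
        """Look up the operation corresponding to a recognized gesture."""
        return self.library.get(gesture)

component = GestureComponent()
gesture = component.recognize("swipe_right")
op = component.operation_for(gesture)  # "next_track"
```

In the patent's framing, the 3D object detection system would supply the detected input, and the application would perform the returned operation.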
  • Patent number: 10175781
    Abstract: This document describes an interactive object with multiple electronics modules. An interactive object (e.g., a garment) includes a grid or array of conductive thread woven into the interactive object, and an internal electronics module coupled to the grid of conductive thread. The internal electronics module includes a first subset of electronic components, such as sensing circuitry configured to detect touch-input to the grid of conductive thread. An external electronics module that includes a second subset of electronic components (e.g., a microprocessor, power source, or network interface) is removably coupled to the interactive object via a communication interface. The communication interface enables communication between the internal electronics module and the external electronics module when the external electronics module is coupled to the interactive object.
    Type: Grant
    Filed: November 28, 2016
    Date of Patent: January 8, 2019
    Assignee: Google LLC
    Inventors: Mustafa Emre Karagozler, Ivan Poupyrev, Carsten C. Schwesig, Hakim K. Raja, Nan-Wei Gong, Shiho Fukuhara, Karen Elizabeth Robinson
  • Patent number: 10155274
    Abstract: This document describes techniques and apparatuses for attaching electronic components to interactive textiles. In various implementations, an interactive textile that includes conductive thread woven into the interactive textile is received. The conductive thread includes a conductive wire (e.g., a copper wire) that is twisted, braided, or wrapped with one or more flexible threads (e.g., polyester or cotton threads). A fabric stripping process is applied to the interactive textile to strip away fabric of the interactive textile and the flexible threads to expose the conductive wire in a window of the interactive textile. After exposing the conductive wires in the window of the interactive textile, an electronic component (e.g., a flexible circuit board) is attached to the exposed conductive wire of the conductive thread in the window of the interactive textile.
    Type: Grant
    Filed: May 3, 2017
    Date of Patent: December 18, 2018
    Assignee: Google LLC
    Inventors: Karen Elizabeth Robinson, Nan-Wei Gong, Mustafa Emre Karagozler, Ivan Poupyrev
  • Patent number: 10139916
    Abstract: This document describes techniques using, and devices embodying, wide-field radar-based gesture recognition. These techniques and devices can enable a great breadth of gestures and uses for those gestures, such as gestures to use, control, and interact with computing and non-computing devices, from software applications to refrigerators.
    Type: Grant
    Filed: April 29, 2016
    Date of Patent: November 27, 2018
    Assignee: Google LLC
    Inventor: Ivan Poupyrev
  • Patent number: 10126123
    Abstract: Embodiments of the present invention provide a system and method that use projected structured patterns of light and linear optical sensors for motion tracking. Sensors are capable of recovering two-dimensional location within the projection area, while several sensors can be combined for up to six degrees of freedom tracking. The structured patterns are based on m-sequences, in which any consecutive subsequence of m bits is unique. Both digital and static light sources can be used. The system and method of the present invention enable high-speed, high-precision, and low-cost motion tracking for a wide range of applications.
    Type: Grant
    Filed: September 21, 2015
    Date of Patent: November 13, 2018
    Assignees: CARNEGIE MELLON UNIVERSITY, DISNEY ENTERPRISES, INC.
    Inventors: Christopher Harrison, Bo Robert Xiao, Scott E. Hudson, Ivan Poupyrev, Karl D. D. Willis
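The property the patent above leans on is that in an m-sequence of degree m, every consecutive window of m bits is unique over one period, so a sensor that decodes any m projected bits can localize itself within the pattern. A sketch using a maximal-length linear-feedback shift register (the tap positions below correspond to the primitive polynomial x^4 + x^3 + 1; the patent does not specify a particular polynomial):

```python
def lfsr_msequence(taps, degree):
    """Generate one period of a maximal-length sequence (m-sequence)
    from a Fibonacci LFSR with the given 1-indexed tap positions."""
    state = [1] * degree
    out = []
    for _ in range(2**degree - 1):
        out.append(state[-1])        # output the last register bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]       # XOR the tapped bits
        state = [fb] + state[:-1]    # shift in the feedback bit
    return out

seq = lfsr_msequence(taps=[4, 3], degree=4)  # period 2**4 - 1 = 15

# Every consecutive 4-bit window of the cyclic sequence is unique,
# so reading any 4 bits pins down the position within the pattern.
windows = {tuple((seq + seq)[i:i + 4]) for i in range(len(seq))}
assert len(windows) == len(seq)  # 15 distinct windows
```

With one such pattern per axis, a linear optical sensor recovers a 2D location in the projection area, and several sensors can be combined for higher-degree-of-freedom tracking, as the abstract describes.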
  • Publication number: 20180310659
    Abstract: This document describes an interactive object with at least one electronics module and a touch sensor. The interactive object may be a garment, garment accessory, or garment container. The interactive object may be configured to provide at least a haptic, audio, or visual output. The interactive object may also contain conductive threads and a touch sensor containing said conductive threads.
    Type: Application
    Filed: April 26, 2018
    Publication date: November 1, 2018
    Inventors: Ivan Poupyrev, Leonardo Giusti, Mauricio Gutierrez Bravo
  • Publication number: 20180310644
    Abstract: This document describes an interactive object with at least one electronics module and a touch sensor. The interactive object may be a garment, garment accessory, or garment container. The interactive object may be configured to provide at least a haptic, audio, or visual output. The interactive object may also contain conductive threads and a touch sensor containing said conductive threads.
    Type: Application
    Filed: April 26, 2018
    Publication date: November 1, 2018
    Inventors: Ivan Poupyrev, Leonardo Giusti, Mauricio Gutierrez Bravo
  • Patent number: 10088908
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: September 23, 2015
    Date of Patent: October 2, 2018
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop