Patents by Inventor Ivan Poupyrev

Ivan Poupyrev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11169988
    Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
    Type: Grant
    Filed: October 1, 2014
    Date of Patent: November 9, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
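
A minimal, hypothetical sketch of the search-refinement idea in the abstract above: a radar-derived walking direction or pointing target is folded into a place query. The RadarReading type and refine_search() helper are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class RadarReading:
    heading_deg: float       # direction the person is walking, 0 = north
    pointing_at: str | None  # object a pointing gesture resolves to, if any

def refine_search(query: str, reading: RadarReading) -> str:
    """Bias a free-text search with radar-derived context."""
    if reading.pointing_at is not None:
        # A pointing gesture can refine, or entirely replace, the typed query.
        return f"information about {reading.pointing_at}"
    # Otherwise restrict results to the direction of travel.
    return f"{query} ahead of me, bearing {reading.heading_deg:.0f} degrees"

print(refine_search("coffee shop", RadarReading(heading_deg=90, pointing_at=None)))
```
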
  • Publication number: 20210342008
    Abstract: This document describes techniques and systems for radar-based gesture-recognition with context-sensitive gating and other context-sensitive controls. Sensor data from a proximity sensor (108) and/or a movement sensor (108) produces a context of a user equipment (102). The techniques and systems enable the user equipment (102) to recognize contexts when a radar system (104) can be unreliable and should not be used for gesture-recognition, enabling the user equipment (102) to automatically disable or “gate” the output from the radar system (104) according to context. The user equipment (102) prevents the radar system (104) from transitioning to a high-power state (1910) to perform gesture-recognition in contexts where radar data detected by the radar system (104) is likely due to unintentional input. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition.
    Type: Application
    Filed: September 27, 2019
    Publication date: November 4, 2021
    Applicant: Google LLC
    Inventors: Vignesh Sachidanandam, Ivan Poupyrev, Leonardo Giusti, Devon James O'Reilley Stern, Jung Ook Hong, Patrick M. Amihood, John David Jacobs, Abel Seleshi Mengistu, Brandon Barbello, Tyler Reed Kugler
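
A sketch of the context-sensitive gating logic described in the abstract above, under assumed context names and an assumed gate function; the real method, thresholds, and power states are defined by the application itself.

```python
from enum import Enum, auto

class Context(Enum):
    IN_POCKET = auto()      # proximity sensor covered, device occluded
    FACE_DOWN = auto()      # movement sensor reports display-down orientation
    HELD_UPRIGHT = auto()   # normal handheld use

def radar_gesture_allowed(context: Context) -> bool:
    """Gate radar gesture-recognition: block contexts where radar data is
    likely unintentional, so the radar system is not transitioned to its
    high-power gesture-recognition state."""
    return context not in {Context.IN_POCKET, Context.FACE_DOWN}

for ctx in Context:
    print(ctx.name, "->", "recognize gestures" if radar_gesture_allowed(ctx) else "gated")
```
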
  • Patent number: 11163371
    Abstract: This document describes techniques and devices for non-line-of-sight radar-based gesture recognition. Through use of the techniques and devices described herein, users may control their devices through in-the-air gestures, even when those gestures are not within line-of-sight of their device's sensors. Thus, the techniques enable users to control their devices in many situations in which control is desired but conventional techniques do not permit effective control, such as to turn the temperature down in a room when the user is obscured from a thermostat's gesture sensor, turn up the volume on a media player when the user is in a different room than the media player, or pause a television program when the user's gesture is obscured by a chair, couch, or other obstruction.
    Type: Grant
    Filed: March 18, 2020
    Date of Patent: November 2, 2021
    Assignee: Google LLC
    Inventor: Ivan Poupyrev
  • Publication number: 20210326642
    Abstract: Various embodiments dynamically learn user-customizable input gestures. A user can transition a radar-based gesture detection system into a gesture-learning mode. In turn, the radar-based gesture detection system emits a radar field configured to detect a gesture new to the radar-based gesture detection system. The radar-based gesture detection system receives incoming radio frequency (RF) signals generated by the outgoing RF signal reflecting off the gesture, and analyzes the incoming RF signals to learn one or more identifying characteristics about the gesture. Upon learning the identifying characteristics, the radar-based gesture detection system reconfigures a corresponding input identification system to detect the gesture when the one or more identifying characteristics are next identified, and transitions out of the gesture-learning mode.
    Type: Application
    Filed: June 29, 2021
    Publication date: October 21, 2021
    Applicant: Google LLC
    Inventors: Nicholas Edward Gillian, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
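
A hypothetical state machine for the gesture-learning mode described above: enter learning mode, collect features from reflected RF signals for a new gesture, derive an identifying characteristic, then reconfigure recognition and exit. Class and method names are illustrative only, not the claimed design.

```python
import statistics

class GestureLearner:
    def __init__(self):
        self.learning = False
        self.samples: list[float] = []          # e.g. one Doppler feature per frame
        self.known_gestures: dict[str, float] = {}

    def enter_learning_mode(self):
        self.learning, self.samples = True, []

    def observe(self, reflected_rf_feature: float):
        if self.learning:
            self.samples.append(reflected_rf_feature)

    def finish(self, name: str):
        # "Identifying characteristic" reduced here to a mean feature value.
        self.known_gestures[name] = statistics.mean(self.samples)
        self.learning = False                    # transition out of learning mode

learner = GestureLearner()
learner.enter_learning_mode()
for frame in (0.41, 0.44, 0.39):
    learner.observe(frame)
learner.finish("double_twist")
print(learner.known_gestures)
```
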
  • Patent number: 11151857
    Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
    Type: Grant
    Filed: March 23, 2020
    Date of Patent: October 19, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Antonio Xavier Cerruto, Mustafa Emre Karagozler, David Scott Allmon, Munehiko Sato, Susan Jane Wilhite, Shiho Fukuhara
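
A hypothetical mapping from a measured bend angle to an ergonomic zone, in the spirit of the abstract above. The zone boundaries below are invented for illustration and are not taken from the patent.

```python
def ergonomic_zone(bend_angle_deg: float) -> str:
    """Classify a body-segment bend angle into a coarse ergonomic zone."""
    if bend_angle_deg < 20:
        return "green"    # low-risk posture
    if bend_angle_deg < 45:
        return "yellow"   # moderate risk, occasional exposure acceptable
    return "red"          # high risk, posture should be corrected

print(ergonomic_zone(15), ergonomic_zone(30), ergonomic_zone(70))
```
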
  • Patent number: 11140787
    Abstract: This document describes techniques and apparatuses for connecting an electronic component to an interactive textile. Loose conductive threads of the interactive textile are collected and organized into a ribbon with a pitch that matches a corresponding pitch of connection points of the electronic component. Next, non-conductive material of the conductive threads of the ribbon is stripped to expose the conductive wires of the conductive threads. After stripping the non-conductive material from the conductive threads of the ribbon, the connection points of the electronic component are bonded to the conductive wires of the ribbon. The conductive threads proximate the ribbon are then sealed using a UV-curable or heat-curable epoxy, and the electronic component and the ribbon are encapsulated to the interactive textile with a water-resistant material, such as plastic or polymer.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: October 5, 2021
    Assignee: Google LLC
    Inventors: Mustafa Emre Karagozler, Ivan Poupyrev, Nan-Wei Gong, Karen Elizabeth Robinson, Patricia Hayes-Danitz, Megan Grant
  • Patent number: 11132065
    Abstract: This document describes apparatuses and techniques for radar-enabled sensor fusion. In some aspects, a radar field is provided and reflection signals that correspond to a target in the radar field are received. The reflection signals are transformed to provide radar data, from which a radar feature indicating a physical characteristic of the target is extracted. Based on the radar feature, a sensor is activated to provide supplemental sensor data associated with the physical characteristic. The radar feature is then augmented with the supplemental sensor data to enhance the radar feature, such as by increasing an accuracy or resolution of the radar feature. By so doing, performance of sensor-based applications, which rely on the enhanced radar features, can be improved.
    Type: Grant
    Filed: July 3, 2019
    Date of Patent: September 28, 2021
    Assignee: Google LLC
    Inventors: Nicholas Edward Gillian, Carsten C. Schwesig, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
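
A sketch of the fusion flow in the abstract above: extract a radar feature, activate a complementary sensor, and augment the radar feature with the supplemental data. All names, values, and the fusion rule are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class RadarFeature:
    kind: str            # e.g. "surface_texture"
    value: float
    resolution: float    # coarse when radar is used alone

def activate_supplemental_sensor(feature: RadarFeature) -> float:
    # Stand-in for powering up, e.g., a camera or IMU that measures the same
    # physical characteristic the radar feature describes.
    return feature.value * 1.02

def fuse(feature: RadarFeature) -> RadarFeature:
    supplemental = activate_supplemental_sensor(feature)
    refined = (feature.value + supplemental) / 2
    return RadarFeature(feature.kind, refined, resolution=feature.resolution / 4)

print(fuse(RadarFeature("surface_texture", 0.80, resolution=1.0)))
```
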
  • Patent number: 11103015
    Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
    Type: Grant
    Filed: April 8, 2020
    Date of Patent: August 31, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Mustafa Emre Karagozler, Hakim K. Raja, David Scott Allmon, Gerard George Pallipuram, Shiho Fukuhara, Nan-Wei Gong
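
An illustrative gesture-manager dispatch for the interactive fabric described above: a touch gesture recognized on the garment triggers a device action. The gesture names and callbacks are assumptions for the sketch, not the patented gesture set.

```python
ACTIONS = {
    "swipe_up": lambda: print("answering phone call"),
    "swipe_down": lambda: print("sending text message"),
    "double_tap": lambda: print("creating journal entry"),
}

def on_fabric_gesture(gesture: str) -> None:
    """Called when the touch-sensing fabric reports a gesture, whether the
    gesture manager runs on the garment, a paired device, or a cloud service."""
    action = ACTIONS.get(gesture)
    if action is not None:
        action()

on_fabric_gesture("swipe_up")
```
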
  • Patent number: 11079289
    Abstract: This document describes techniques using, and devices embodying, radar-based force sensing. These techniques and devices can enable a great breadth of forces to be measured. Furthermore, radar-based force sensing enables those forces to be used to control and interact with devices.
    Type: Grant
    Filed: August 18, 2017
    Date of Patent: August 3, 2021
    Assignee: Google LLC
    Inventor: Ivan Poupyrev
  • Patent number: 11080556
    Abstract: Various embodiments dynamically learn user-customizable input gestures. A user can transition a radar-based gesture detection system into a gesture-learning mode. In turn, the radar-based gesture detection system emits a radar field configured to detect a gesture new to the radar-based gesture detection system. The radar-based gesture detection system receives incoming radio frequency (RF) signals generated by the outgoing RF signal reflecting off the gesture, and analyzes the incoming RF signals to learn one or more identifying characteristics about the gesture. Upon learning the identifying characteristics, the radar-based gesture detection system reconfigures a corresponding input identification system to detect the gesture when the one or more identifying characteristics are next identified, and transitions out of the gesture-learning mode.
    Type: Grant
    Filed: October 6, 2016
    Date of Patent: August 3, 2021
    Assignee: Google LLC
    Inventors: Nicholas Edward Gillian, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
  • Publication number: 20210232303
    Abstract: Systems and methods of providing gesture-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Application
    Filed: April 12, 2021
    Publication date: July 29, 2021
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
  • Publication number: 20210216153
    Abstract: A removable electronics device and related pre-fabricated sensor assemblies having different sensor layouts are provided. The removable electronics module includes one or more processors, an inertial measurement unit, a first communication interface configured to communicatively couple the removable electronics device to one or more computing devices, a second communication interface configured to communicatively couple the removable electronics device to a plurality of pre-fabricated sensor assemblies, and a housing at least partially enclosing the processor, the inertial measurement unit, the first communication interface, and the second communication interface. The housing includes a first opening in at least one longitudinal surface and adjacent to at least a portion of the first communication interface and a plurality of second openings in a lower surface and adjacent to the plurality of contact pads of the second communication interface.
    Type: Application
    Filed: March 29, 2021
    Publication date: July 15, 2021
    Inventors: Kishore Sundara-Rajan, Mauricio E. Gutierrez Bravo, Ivan Poupyrev, Alejandro Kauffmann
  • Patent number: 11003345
    Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Grant
    Filed: December 7, 2016
    Date of Patent: May 11, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
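
A hypothetical two-zone controller matching the abstract above: the zone in which the control article (e.g. a hand) is detected selects whether the interface shows presentation data or interactive data. The zone radii are invented for illustration.

```python
def select_ui_mode(distance_m: float) -> str:
    FIRST_ZONE_M = 1.5    # outer zone: show the presentation mode
    SECOND_ZONE_M = 0.5   # inner, non-overlapping zone: show the interactive mode
    if distance_m <= SECOND_ZONE_M:
        return "interactive"
    if distance_m <= FIRST_ZONE_M:
        return "presentation"
    return "idle"

for d in (2.0, 1.0, 0.3):
    print(d, "->", select_ui_mode(d))
```
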
  • Publication number: 20210132702
    Abstract: This document describes techniques and devices for occluded gesture recognition. Through use of the techniques and devices described herein, users may control their devices even when a user's gesture is occluded by some material between the user's hands and the device itself. Thus, the techniques enable users to control their mobile devices in many situations in which control is desired but conventional techniques do not permit effective control, such as when a user's mobile computing device is occluded by being in a purse, bag, pocket, or even in another room.
    Type: Application
    Filed: January 13, 2021
    Publication date: May 6, 2021
    Applicant: Google LLC
    Inventor: Ivan Poupyrev
  • Publication number: 20210132788
    Abstract: This document describes techniques and systems that enable a radar system facilitating ease and accuracy of user interactions with a user interface. The techniques and systems can be implemented in an electronic device, such as a smartphone, and use a radar field to accurately determine three-dimensional (3D) gestures that can be used in combination with other inputs, such as touch or voice inputs, to interact with the user interface. These techniques allow the user to make 3D gestures from a distance and enable seamless integration of touch and voice commands with 3D gestures to improve functionality and user enjoyment.
    Type: Application
    Filed: January 13, 2021
    Publication date: May 6, 2021
    Applicant: Google LLC
    Inventors: Leonardo Giusti, Ivan Poupyrev, Patrick M. Amihood
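
A sketch of combining a radar-detected 3D gesture with a touch or voice input, as the abstract above describes. The event names and the combine() rule are assumptions, not the claimed interface behavior.

```python
def combine(radar_gesture: str | None, other_input: str | None) -> str:
    if radar_gesture and other_input:
        # e.g. voice selects the target, the 3D gesture manipulates it
        return f"{other_input} + {radar_gesture}: apply gesture to selected item"
    if radar_gesture:
        return f"{radar_gesture}: act on current focus"
    if other_input:
        return f"{other_input}: handle as ordinary touch/voice input"
    return "no input"

print(combine("air_dial_clockwise", "voice:select volume"))
```
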
  • Publication number: 20210110717
    Abstract: An interactive object and computing devices are configured to provide vehicle-related notifications and gesture detection to enable user interaction with a vehicle service. A computing system can receive data associated with a status of a vehicle that is providing a vehicle service associated with a user of an interactive object. The computing system can provide one or more output signals to one or more output devices of the interactive object. The one or more output signals are based at least in part on the data associated with the status of the vehicle. The computing system can provide, in response to the one or more output signals, an output response indicative of the status of the vehicle.
    Type: Application
    Filed: April 18, 2019
    Publication date: April 15, 2021
    Applicant: Google LLC
    Inventors: Leonardo Giusti, Ivan Poupyrev, Suniti Nina Walia
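
A hypothetical mapping from a vehicle-service status update to an output signal on an interactive object (for example, a haptic or LED cue in a garment), following the flow in the abstract above. The status names and signal values are invented for illustration.

```python
STATUS_TO_SIGNAL = {
    "driver_assigned": "led:blue_pulse",
    "arriving_soon": "haptic:short_buzz",
    "arrived": "haptic:long_buzz",
}

def on_vehicle_status(status: str, send_signal) -> None:
    signal = STATUS_TO_SIGNAL.get(status)
    if signal is not None:
        send_signal(signal)   # drives an output device on the interactive object

on_vehicle_status("arriving_soon", send_signal=print)
```
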
  • Publication number: 20210096653
    Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even at the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured.
    Type: Application
    Filed: December 11, 2020
    Publication date: April 1, 2021
    Applicant: Google LLC
    Inventors: Patrick M. Amihood, Ivan Poupyrev
  • Patent number: 10963106
    Abstract: A removable electronics device and related pre-fabricated sensor assemblies having different sensor layouts are provided. The removable electronics module includes one or more processors, an inertial measurement unit, a first communication interface configured to communicatively couple the removable electronics device to one or more computing devices, a second communication interface configured to communicatively couple the removable electronics device to a plurality of pre-fabricated sensor assemblies, and a housing at least partially enclosing the processor, the inertial measurement unit, the first communication interface, and the second communication interface. The housing includes a first opening in at least one longitudinal surface and adjacent to at least a portion of the first communication interface and a plurality of second openings in a lower surface and adjacent to the plurality of contact pads of the second communication interface.
    Type: Grant
    Filed: August 26, 2019
    Date of Patent: March 30, 2021
    Assignee: Google LLC
    Inventors: Kishore Sundara-Rajan, Mauricio E. Gutierrez Bravo, Ivan Poupyrev, Alejandro Kauffmann, Mustafa Emre Karagozler
  • Publication number: 20210088643
    Abstract: This document describes techniques and systems that enable a mobile device-based radar system (104) for providing a multi-mode interface (114). A radar field (110) is used to enable a user device (102, 702) to accurately determine a presence or threshold movement of a user near the user device. The user device provides a multi-mode interface having at least first and second modes and providing a black display or a low-luminosity display in the first mode. The user device detects, based on radar data and during the first mode, a presence or threshold movement by the user relative to the user device and responsively changes the multi-mode interface from the first mode to the second mode. Responsive to the change to the second mode, the user device provides visual feedback corresponding to the implicit interaction by adjusting one or more display parameters of the black display or the low-luminosity display.
    Type: Application
    Filed: May 20, 2019
    Publication date: March 25, 2021
    Applicant: Google LLC
    Inventors: Eiji Hayashi, Vignesh Sachidanandam, Leonardo Giusti, Jaime Lien, Patrick M. Amihood, Ivan Poupyrev
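
A minimal sketch of the two-mode display behavior described above: in the first mode the display is black or dimmed; a detected presence or threshold movement switches the interface to the second mode and adjusts display parameters as visual feedback. The thresholds and parameter names are illustrative assumptions.

```python
class MultiModeInterface:
    def __init__(self):
        self.mode = 1
        self.luminosity = 0.0          # black display in the first mode

    def on_radar(self, presence: bool, movement: float, threshold: float = 0.2):
        if self.mode == 1 and (presence or movement >= threshold):
            self.mode = 2
            self.luminosity = 0.3      # visual feedback for the implicit interaction

ui = MultiModeInterface()
ui.on_radar(presence=False, movement=0.35)
print(ui.mode, ui.luminosity)
```
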
  • Patent number: 10948996
    Abstract: This document describes techniques and devices for radar-based gesture-recognition through a wearable device. The techniques enable an easy-to-use input interface through this wearable radar device, in contrast to small or difficult-to-use input interfaces common to wearable computing devices. Further, these techniques are not limited to interfacing with wearable computing devices, but may aid users in controlling various non-wearable devices, such as to control volume on a stereo, pause a movie playing on a television, or select a webpage on a desktop computer.
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: March 16, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello