Patents by Inventor Vinay Chawda

Vinay Chawda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240103705
    Abstract: Various implementations disclosed herein facilitate interactions with a user interface in a 3D environment in which a user interface element is moved based on a user movement such that the user interface element appears to lag behind or follow a portion of the user (e.g., the user's fingertip). The user interface element may be moved such that it converges with, and thus catches up to, the portion of the user. Such convergence may be based on the speed of the movement of the portion of the user. No convergence may occur when the portion of the user is not moving or is moving below a threshold speed. When the portion of the user is moving (e.g., above a threshold speed), the user interface element may converge with the portion of the user, and the rate of convergence may increase with faster speeds.
    Type: Application
    Filed: September 12, 2023
    Publication date: March 28, 2024
    Inventors: Vinay Chawda, Julian K. Shutzberg, Chase B. Lortie, Daniel J. Brewer, David J. Meyer, Leah M. Gum
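    The speed-dependent convergence this abstract describes can be sketched as a simple per-frame filter: below a speed threshold the element holds its lagged position, and above it the element closes a fraction of the remaining gap, with that fraction growing with fingertip speed. All function names, thresholds, and constants below are illustrative assumptions, not taken from the patent.

    ```python
    def follow(element_pos, finger_pos, finger_speed,
               threshold=0.05, base_rate=0.2, speed_gain=0.5, max_rate=1.0):
        """Move a UI element a fraction of the way toward the fingertip.

        No convergence occurs below the speed threshold; above it, the
        convergence rate increases with finger speed (clamped to max_rate,
        at which point the element snaps to the fingertip).
        """
        if finger_speed < threshold:
            return element_pos  # fingertip (nearly) still: element lags in place
        rate = min(base_rate + speed_gain * finger_speed, max_rate)
        return element_pos + rate * (finger_pos - element_pos)
    ```

    Calling this once per frame yields the described behavior: a stationary fingertip leaves the element lagging behind, while faster motion makes it catch up more aggressively.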
  • Publication number: 20240103634
    Abstract: Techniques for mapping a user input motion include detecting an input motion by a user, determining an origin for the input motion in a user-centric spherical coordinate system, determining an arc length for the input motion based on the determined origin, mapping the arc length of the input motion to a 2D plane of a user input component, and presenting a movement of the user input component on the 2D plane in accordance with the mapping.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Vinay Chawda, Chase B. Lortie, Daniel J. Brewer, Julian K. Shutzberg, Leah M. Gum, Yirong Tang, Alexander T. Wing
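    The arc-length mapping above can be sketched as follows: treat the motion as an arc on a sphere centered at a user-centric origin (arc length = radius x swept angle), then apply that length along the motion's direction in the 2D input plane. The projection onto the x/y plane is a simplifying assumption of this sketch, not the patent's method.

    ```python
    import math

    def arc_length_to_plane(origin, p0, p1, scale=1.0):
        """Map a user motion from 3D point p0 to p1 onto a 2D cursor delta.

        The motion is treated as an arc on a sphere centered at `origin`:
        arc length = mean radius * swept angle. The arc length is then
        applied along the motion direction projected onto the x/y plane
        of the input surface.
        """
        v0 = [a - b for a, b in zip(p0, origin)]
        v1 = [a - b for a, b in zip(p1, origin)]
        r0 = math.sqrt(sum(c * c for c in v0))
        r1 = math.sqrt(sum(c * c for c in v1))
        dot = sum(a * b for a, b in zip(v0, v1))
        angle = math.acos(max(-1.0, min(1.0, dot / (r0 * r1))))
        arc = 0.5 * (r0 + r1) * angle  # mean radius * swept angle
        # project the motion direction onto the plane (drop the depth axis)
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        norm = math.hypot(dx, dy) or 1.0
        return (scale * arc * dx / norm, scale * arc * dy / norm)
    ```

    Because the delta scales with radius, the same swept angle performed with an extended arm produces a larger on-screen movement than one performed close to the body.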
  • Publication number: 20240103613
    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
    Type: Application
    Filed: September 11, 2023
    Publication date: March 28, 2024
    Inventors: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Birardino
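    The pinch-and-gaze association described above can be sketched as a temporal match: a pinch is paired with the gaze sample nearest in time, provided it falls within an association window. The window length and data shapes are assumptions of this sketch, not from the patent.

    ```python
    def associate_pinch_with_gaze(pinch_time, gaze_events, window=0.3):
        """Return the UI target the user was gazing at around the pinch time.

        gaze_events: list of (timestamp, target) samples. The pinch is
        associated with the gaze sample closest in time, provided it falls
        within the association window (seconds); otherwise returns None,
        i.e., the hand gesture is not interpreted as input on any target.
        """
        candidates = [(abs(t - pinch_time), target)
                      for t, target in gaze_events
                      if abs(t - pinch_time) <= window]
        return min(candidates)[1] if candidates else None
    ```

    A fuller implementation would also filter the gaze samples to those likely to reflect intentional looking (fixations rather than saccades), per the last sentence of the abstract.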
  • Patent number: 10675766
    Abstract: A system for providing a user of a virtual reality (VR) system with physical interactions with an object in the real world or in the surrounding physical space while they are concurrently interacting in the virtual world with a corresponding virtual object. The real world object is dynamic with the system including a physical interaction system that includes a robot with a manipulator for moving, positioning, and/or orienting the real world object to move it into contact with the user. For example, the physical object is moved into contact with a tracked body part of the user at a time that is synchronized with a time of an interaction event occurring in the virtual world being created by the VR system. Further, a system is described for providing a dynamic physical interaction to a human participant, e.g., a fast and compelling handover in an augmented reality (AR) system.
    Type: Grant
    Filed: June 6, 2019
    Date of Patent: June 9, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Günter D. Niemeyer, Lanny S. Smoot, Vinay Chawda, Matthew Keith Xi-Jie Pan, Moritz Bächer, Lars Espen Knoop
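    The synchronization described here (physical contact timed to a virtual interaction event) can be sketched as a scheduling check: given the tracked hand position and the event time, compute the velocity that brings the object there exactly on time, or report that the handover is infeasible. The constant-velocity model and speed limit are illustrative assumptions only.

    ```python
    def plan_handover(robot_pos, hand_pos, event_time, now, max_speed=2.0):
        """Compute a constant velocity (m/s per axis) that brings the held
        object to the tracked hand position exactly at the virtual event time.

        Returns None if the event time has passed or the required speed
        exceeds the robot's limit, in which case the event would need to
        be rescheduled.
        """
        dt = event_time - now
        if dt <= 0:
            return None
        delta = [h - r for h, r in zip(hand_pos, robot_pos)]
        dist = sum(d * d for d in delta) ** 0.5
        if dist / dt > max_speed:
            return None  # cannot reach the hand in time at a safe speed
        return [d / dt for d in delta]
    ```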
  • Patent number: 10362299
    Abstract: A system for providing a user of a virtual reality (VR) system with physical interactions with an object in the real world or in the surrounding physical space while they are concurrently interacting in the virtual world with a corresponding virtual object. The real world object is dynamic with the system including a physical interaction system that includes a robot with a manipulator for moving, positioning, and/or orienting the real world object so as to move it into contact with the user. For example, the physical object is moved into contact with a tracked body part of the user, such as a hand, a tracked contact surface on the user's body, and so on, at a time that is accurately synchronized with a time of an interaction event occurring in the virtual world being created by the VR system.
    Type: Grant
    Filed: August 28, 2017
    Date of Patent: July 23, 2019
    Assignee: Disney Enterprises, Inc.
    Inventors: Gunter D. Niemeyer, Lanny S. Smoot, Vinay Chawda, Matthew Keith Xi-Jie Pan
  • Patent number: 10353467
    Abstract: Disclosed herein are methods and systems for providing haptic output and audio output on computing devices using the same haptic device and methods for calibrating the same. To produce the haptic and audio output, the computing device receives a profile of a desired output waveform that is to be provided by the haptic device. Using the desired output waveform, an input waveform is generated. Once the input waveform that will produce the desired output waveform is generated, the input waveform may be calibrated to account for various structural components of the haptic device and may also be combined with an audio waveform. The input waveform is then provided to the haptic device.
    Type: Grant
    Filed: February 18, 2016
    Date of Patent: July 16, 2019
    Assignee: Apple Inc.
    Inventors: Peteris K. Augenbergs, Marc J. Piche, Vinay Chawda, Nicole M. Wells, Scott J. McEuen, Curtis P. Wiederhold, Jonah A. Harley, Wayne C. Westerman, Jeffrey T. Bernstein, Brett W. Degner, Paul Briant, Thomas Wedlick
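    Deriving a drive waveform from a desired output, as this abstract describes, can be sketched as an inverse filter: divide the desired output's spectrum by a measured actuator frequency response, transform back, and optionally mix in an audio waveform. The calibration model and regularization constant are assumptions of this sketch.

    ```python
    import numpy as np

    def input_waveform(desired_output, actuator_response, audio=None, eps=1e-6):
        """Derive the drive waveform that should produce `desired_output`
        through an actuator with the given frequency response, then mix in
        an optional audio waveform.

        desired_output: time-domain samples of the wanted haptic output.
        actuator_response: complex frequency response sampled on the same
        rFFT grid (an assumed, pre-measured calibration). `eps` guards
        against division by near-zero response bins.
        """
        spectrum = np.fft.rfft(desired_output)
        drive = np.fft.irfft(spectrum / (actuator_response + eps),
                             n=len(desired_output))
        if audio is not None:
            drive = drive + audio  # superimpose audio on the haptic drive
        return drive
    ```

    With a flat response the inverse filter reduces to a simple gain correction; a real calibration would capture the actuator's resonances so the delivered output matches the requested profile across frequency.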
  • Patent number: 10254870
    Abstract: An electronic device is disclosed. The electronic device comprises a touch sensor panel configured to detect an object touching the touch sensor panel and a plurality of force sensors coupled to the touch sensor panel and configured to detect an amount of force with which the object touches the touch sensor panel. A processor is coupled to the plurality of force sensors, the processor configured to: measure a first value from a first force sensor of the plurality of force sensors; measure a second value from a second force sensor of the plurality of force sensors, different from the first force sensor; and determine a motion characteristic of the electronic device based on the first value and the second value.
    Type: Grant
    Filed: April 1, 2016
    Date of Patent: April 9, 2019
    Assignee: Apple Inc.
    Inventors: Vinay Chawda, Vikrham Gowreesunker, Alex Bijamov
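    Determining a motion characteristic from two force sensors, as claimed above, can be sketched with a common-mode/differential decomposition: equal inertial loading appears on both sensors, while rotation loads them unequally. This particular model is an illustrative stand-in, not the patent's own formulation.

    ```python
    def motion_characteristic(reading_a, reading_b, scale=1.0):
        """Estimate a motion characteristic from two force sensor readings.

        With no touch applied, uniform inertial loading shows up as a
        common mode on both sensors, while a rotation (the device being
        tilted or swung) shows up as a differential between them. The
        common mode approximates linear acceleration; the differential
        approximates angular motion.
        """
        common = 0.5 * (reading_a + reading_b)
        differential = reading_a - reading_b
        return {"linear": scale * common, "angular": scale * differential}
    ```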
  • Patent number: 9910411
    Abstract: The controller includes a differentiating engine configured to receive an input signal value (ISV), wherein the ISV corresponds to state information for one selected from a group consisting of a controlled process and a user interface. The differentiating engine is further configured to determine an error between the ISV and an estimated input signal (EIS), estimate a frequency of the input signal, select a plurality of pre-determined gains using the frequency, wherein at least one of the plurality of pre-determined gains is a suction control gain, determine a first estimated derivative of the input signal (EDIS) using the plurality of pre-determined gains and the error, and output the first EDIS.
    Type: Grant
    Filed: January 14, 2011
    Date of Patent: March 6, 2018
    Assignee: William Marsh Rice University
    Inventors: Ozkan Celik, Vinay Chawda, Marcia K. O'Malley
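    The gain-scheduled differentiation described in this abstract can be sketched as a tracking observer whose gains are chosen from a table indexed by the estimated input frequency: the observer tracks the input and its derivative, and the derivative estimate is the output. The gain table, observer structure, and omission of the suction control gain are simplifications of this sketch, not the patented controller.

    ```python
    def make_differentiator(gain_table, dt):
        """Create a simple gain-scheduled tracking differentiator.

        gain_table: {frequency_band_upper_bound: (k1, k2)} - a gain pair
        is selected per step from the lowest band containing the
        estimated input frequency. The returned step function tracks the
        input with a second-order observer and outputs the estimated
        derivative of the input signal.
        """
        state = {"x": 0.0, "v": 0.0}

        def step(signal, est_freq):
            # pick the gain pair for the lowest band containing est_freq
            k1, k2 = next(g for f, g in sorted(gain_table.items())
                          if est_freq <= f)
            err = signal - state["x"]
            state["v"] += dt * k2 * err
            state["x"] += dt * (state["v"] + k1 * err)
            return state["v"]

        return step
    ```

    Scheduling the gains by frequency lets the differentiator stay responsive for fast inputs without amplifying noise on slow ones.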
  • Publication number: 20170153760
    Abstract: An electronic device can include gain-based error tracking for improved force sensing performance. The electronic device can comprise a plurality of force sensors (e.g., coupled to a touch sensor panel configured to detect an object touching the touch sensor panel). The plurality of force sensors can be configured to detect an amount of force with which the object touches the touch sensor panel. A processor can be coupled to the plurality of force sensors, and the processor can be configured to: in accordance with a determination that an acceleration characteristic of the electronic device is less than a threshold, determine an error metric for one or more of the plurality of force sensors, and in accordance with a determination that the acceleration characteristic of the electronic device is not less than the threshold, forgo determining the error metric for one or more of the plurality of force sensors.
    Type: Application
    Filed: April 1, 2016
    Publication date: June 1, 2017
    Inventors: Vinay Chawda, Vikrham Gowreesunker, Leah M. Gum, Teera Songatikamas
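    The threshold-gated error tracking above can be sketched as a conditional update: drift error metrics are computed only while the device is (nearly) at rest, since inertial forces would otherwise corrupt the readings. The error definition and threshold value are assumptions of this sketch.

    ```python
    def maybe_update_error(force_readings, baselines, accel, accel_threshold=0.1):
        """Compute per-sensor drift error metrics only while the device is at rest.

        When the acceleration characteristic is below the threshold, no
        inertial force loads the sensors, so any residual reading relative
        to its baseline is drift. Above the threshold the update is
        forgone (returns None).
        """
        if accel >= accel_threshold:
            return None  # device in motion: forgo the error update
        return [reading - base for reading, base in zip(force_readings, baselines)]
    ```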
  • Publication number: 20170153737
    Abstract: An electronic device is disclosed. The electronic device comprises a touch sensor panel configured to detect an object touching the touch sensor panel and a plurality of force sensors coupled to the touch sensor panel and configured to detect an amount of force with which the object touches the touch sensor panel. A processor is coupled to the plurality of force sensors, the processor configured to: measure a first value from a first force sensor of the plurality of force sensors; measure a second value from a second force sensor of the plurality of force sensors, different from the first force sensor; and determine a motion characteristic of the electronic device based on the first value and the second value.
    Type: Application
    Filed: April 1, 2016
    Publication date: June 1, 2017
    Inventors: Vinay Chawda, Vikrham Gowreesunker, Alex Bijamov
  • Publication number: 20170017346
    Abstract: Systems and methods for detecting user input to an electronic device are disclosed. The electronic device can include an input sensor system that itself includes an input-sensitive structure that compresses or expands in response to user input. The input sensor system monitors an electrical property of the input-sensitive structure for changes. The input sensor system is coupled to an accelerometer and receives acceleration data to modify the detected changes to the input-sensitive structure.
    Type: Application
    Filed: July 15, 2016
    Publication date: January 19, 2017
    Inventors: Baboo V. Gowreesunker, Alex Bijamov, Vinay Chawda
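    The acceleration compensation described above can be sketched as a subtraction: the structure compresses both under finger force and under inertial load, so removing the accelerometer-predicted contribution leaves the user-input component. The linear model and calibration constant are assumptions of this sketch.

    ```python
    def compensated_input(raw_measurement, accel, sensitivity=0.8):
        """Remove the acceleration-induced component from an input-sensitive
        structure's measured electrical property.

        `sensitivity` is an assumed calibration constant relating device
        acceleration to the change in the measured property; the residual
        approximates the component due to actual user input.
        """
        return raw_measurement - sensitivity * accel
    ```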
  • Publication number: 20160259480
    Abstract: Disclosed herein are methods and systems for providing haptic output and audio output on computing devices using the same haptic device and methods for calibrating the same. To produce the haptic and audio output, the computing device receives a profile of a desired output waveform that is to be provided by the haptic device. Using the desired output waveform, an input waveform is generated. Once the input waveform that will produce the desired output waveform is generated, the input waveform may be calibrated to account for various structural components of the haptic device and may also be combined with an audio waveform. The input waveform is then provided to the haptic device.
    Type: Application
    Filed: February 18, 2016
    Publication date: September 8, 2016
    Inventors: Peteris K. Augenbergs, Marc J. Piche, Vinay Chawda, Nicole M. Wells, Scott J. McEuen, Curtis P. Wiederhold, Jonah A. Harley, Wayne C. Westerman, Jeffrey T. Bernstein, Brett W. Degner, Paul Briant, Thomas Wedlick
  • Publication number: 20130297048
    Abstract: The controller includes a differentiating engine configured to receive an input signal value (ISV), wherein the ISV corresponds to state information for one selected from a group consisting of a controlled process and a user interface. The differentiating engine is further configured to determine an error between the ISV and an estimated input signal (EIS), estimate a frequency of the input signal, select a plurality of pre-determined gains using the frequency, wherein at least one of the plurality of pre-determined gains is a suction control gain, determine a first estimated derivative of the input signal (EDIS) using the plurality of pre-determined gains and the error, and output the first EDIS.
    Type: Application
    Filed: January 14, 2011
    Publication date: November 7, 2013
    Applicant: William Marsh Rice University
    Inventors: Ozkan Celik, Vinay Chawda, Marcia K. O'Malley