Patents by Inventor Uday Parshionikar

Uday Parshionikar has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11977677
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Grant
    Filed: August 2, 2022
    Date of Patent: May 7, 2024
    Inventor: Uday Parshionikar
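
The gesture vocabulary in the entry above combines "periods of limited activity" (dwells) with consecutive user actions in orthogonal axes. Purely as an illustration of that combination, and not as the patented algorithm, the following Python sketch waits for the head to hold still and then looks for a yaw motion followed by a pitch motion; the sample rate, thresholds, and state names are all assumptions.

    # Illustrative sketch only: a dwell (period of limited activity)
    # followed by consecutive motions in orthogonal axes (yaw, then
    # pitch). Thresholds and the sample format are hypothetical.
    from collections import deque

    DWELL_THRESHOLD = 2.0    # deg/s below which the head counts as still
    MOTION_THRESHOLD = 15.0  # deg/s above which an axis counts as moving
    DWELL_SAMPLES = 30       # ~0.5 s of stillness at 60 Hz

    class OrthogonalGestureDetector:
        def __init__(self):
            self.recent = deque(maxlen=DWELL_SAMPLES)
            self.state = "waiting_for_dwell"

        def update(self, yaw_rate, pitch_rate):
            """Feed one (yaw, pitch) angular-rate sample; returns a gesture
            name when the dwell + yaw + pitch sequence completes."""
            self.recent.append((abs(yaw_rate), abs(pitch_rate)))
            if self.state == "waiting_for_dwell":
                if len(self.recent) == DWELL_SAMPLES and all(
                    y < DWELL_THRESHOLD and p < DWELL_THRESHOLD
                    for y, p in self.recent
                ):
                    self.state = "expect_yaw"
            elif self.state == "expect_yaw":
                if abs(yaw_rate) > MOTION_THRESHOLD:
                    self.state = "expect_pitch"
            elif self.state == "expect_pitch":
                if abs(pitch_rate) > MOTION_THRESHOLD:
                    self.state = "waiting_for_dwell"
                    self.recent.clear()
                    return "dwell_yaw_pitch"
            return None

In this toy model, requiring the dwell before the motion sequence is what separates a deliberate gesture from incidental head movement.
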
  • Publication number: 20230004232
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest. Use of thresholds that are independent of sensor position or orientation with respect to the user's body is also disclosed.
    Type: Application
    Filed: September 14, 2022
    Publication date: January 5, 2023
    Inventor: Uday Parshionikar
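
The two-level mechanism in the abstract above, where signal generation starts when facial-expression or body information crosses a first level and ends (or changes) at a second level, behaves like hysteresis thresholding. A minimal Python sketch follows; the 0-to-1 intensity scale and both level values are illustrative assumptions, not values from the application.

    # Illustrative hysteresis-style sketch: start emitting signals when
    # an expression reading crosses a first level, stop when it falls
    # to a second. The levels and the 0..1 scale are assumed.
    START_LEVEL = 0.6  # first level: begin generating signals
    END_LEVEL = 0.4    # second level: end signal generation

    def generate_signals(intensities):
        active = False
        for i, x in enumerate(intensities):
            if not active and x >= START_LEVEL:
                active = True    # first-level crossing: start
            elif active and x <= END_LEVEL:
                active = False   # second-level crossing: end
            if active:
                yield (i, "signal")

    # A smile that ramps up, holds, and relaxes.
    readings = [0.1, 0.3, 0.65, 0.8, 0.7, 0.5, 0.35, 0.2]
    print(list(generate_signals(readings)))  # samples 2..5 are active

Because the start and end levels differ, small fluctuations around a single threshold do not rapidly toggle signal generation on and off.
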
  • Publication number: 20220374078
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Application
    Filed: August 2, 2022
    Publication date: November 24, 2022
    Inventor: Uday Parshionikar
  • Patent number: 11481037
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest. Use of thresholds that are independent of sensor position or orientation with respect to the user's body is also disclosed.
    Type: Grant
    Filed: January 15, 2021
    Date of Patent: October 25, 2022
    Assignee: Perceptive Devices LLC
    Inventor: Uday Parshionikar
  • Patent number: 11402902
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Grant
    Filed: January 5, 2021
    Date of Patent: August 2, 2022
    Assignee: Perceptive Devices LLC
    Inventor: Uday Parshionikar
  • Publication number: 20210157402
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Application
    Filed: January 5, 2021
    Publication date: May 27, 2021
    Inventor: Uday Parshionikar
  • Publication number: 20210132703
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest. Use of thresholds that are independent of sensor position or orientation with respect to the user's body is also disclosed.
    Type: Application
    Filed: January 15, 2021
    Publication date: May 6, 2021
    Inventor: Uday Parshionikar
  • Patent number: 10895917
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest. Use of thresholds that are independent of sensor position or orientation with respect to the user's body is also disclosed.
    Type: Grant
    Filed: January 29, 2019
    Date of Patent: January 19, 2021
    Inventor: Uday Parshionikar
  • Patent number: 10884493
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Grant
    Filed: December 24, 2019
    Date of Patent: January 5, 2021
    Inventor: Uday Parshionikar
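
Several entries above disclose coarse as well as fine modification of objects. One plausible, purely illustrative reading is a gain switch: large head motions move an object quickly, while holding a "precision" expression lowers the gain for fine placement. The gains, inputs, and expression test below are hypothetical, not taken from the patents.

    # Illustrative sketch: coarse vs. fine modification via a gain
    # switch keyed to a held expression. Gains are hypothetical.
    COARSE_GAIN = 10.0  # pixels per degree of head motion
    FINE_GAIN = 1.0     # reduced gain while a precision expression is held

    def move_object(x, head_delta_deg, precision_expression_held):
        gain = FINE_GAIN if precision_expression_held else COARSE_GAIN
        return x + gain * head_delta_deg

    x = 100.0
    x = move_object(x, 3.0, False)  # coarse: +30 px
    x = move_object(x, 3.0, True)   # fine: +3 px
    print(x)  # 133.0
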
  • Publication number: 20200249752
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers. Methods for coarse as well as fine modification of objects are also disclosed.
    Type: Application
    Filed: December 24, 2019
    Publication date: August 6, 2020
    Inventor: Uday Parshionikar
  • Patent number: 10558272
    Abstract: Methods of interpreting user actions for controlling electronic devices are disclosed, along with apparatuses and systems implementing the methods. User actions can involve the eyes, head, face, fingers, hands, arms, and other body parts, as well as facial, verbal and/or mental actions, enabling hands-free and/or voice-free control of devices. Methods for large as well as fine motion and placement of objects are described. Actions performed before and during other actions can confirm the intent of those other actions, and objects can be moved or warped based on combinations of actions. Measurement of eye gaze, together with iterations and helper signals, improves accuracy of control. Triggers start and end recognition of user actions and generation of device commands.
    Type: Grant
    Filed: March 14, 2018
    Date of Patent: February 11, 2020
    Inventor: Uday Parshionikar
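
Patent 10558272 above mentions measurement of eye gaze with iterations and helper signals for improved accuracy of control. One illustrative interpretation, an assumption rather than the patented method, is to place a pointer at the coarse gaze estimate and then apply small head-motion "helper" corrections iteratively:

    # Illustrative sketch: a coarse gaze estimate refined by iterative
    # head-motion helper corrections. The gain and inputs are assumed.
    def refine_target(gaze_estimate, helper_deltas, gain=0.5):
        x, y = gaze_estimate              # coarse (x, y) from an eye tracker
        for dx, dy in helper_deltas:      # each iteration nudges the pointer
            x += gain * dx
            y += gain * dy
        return (x, y)

    # Gaze lands near the target; three small head nudges close the gap.
    print(refine_target((640.0, 360.0), [(10, -4), (4, -2), (2, 0)]))
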
  • Publication number: 20190324551
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest. Use of thresholds that are independent of sensor position or orientation with respect to the user's body is also disclosed.
    Type: Application
    Filed: January 29, 2019
    Publication date: October 24, 2019
    Inventor: Uday Parshionikar
  • Publication number: 20190265802
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, muscle tightening, mental activity, and other user actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive user actions along orthogonal axes is disclosed, as is generation of command signals based on start and end triggers.
    Type: Application
    Filed: November 27, 2018
    Publication date: August 29, 2019
    Inventor: Uday Parshionikar
  • Patent number: 10254844
    Abstract: This application discloses methods, systems, and apparatuses, as well as principles/algorithms that can be implemented on a computer-readable medium, for defining user gestures, interpreting user actions, and communicating and confirming user intent when communicating with electronic devices. A system for controlling an electronic device by a user is disclosed that includes a microprocessor and a communication link. The microprocessor runs control software that receives a first signal indicative of a first action of the user and a second signal indicative of motion or position of a part of the user's body, and generates a command signal for the electronic device based on a user gesture performed by the user. The communication link communicates the command signal to the electronic device.
    Type: Grant
    Filed: June 20, 2014
    Date of Patent: April 9, 2019
    Inventor: Uday Parshionikar
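
Patent 10254844 above claims control software that receives a first signal (a user action) and a second signal (motion or position of a body part) and generates a command signal sent over a communication link. The sketch below mirrors that shape only; the class name, the jaw-clench reading of the first signal, and the print-based stand-in for the link are all hypothetical.

    # Illustrative sketch: first signal (user action) plus second signal
    # (body motion) produce a command sent over a "link". All names and
    # the trigger condition are hypothetical.
    class GestureController:
        def __init__(self, send):
            self.send = send             # stand-in for the communication link
            self.trigger_active = False  # first signal, e.g. a clenched jaw

        def on_first_signal(self, active):
            self.trigger_active = active

        def on_second_signal(self, dx, dy):
            # Only motion performed while the first signal is active
            # becomes a command; other motion is treated as incidental.
            if self.trigger_active:
                self.send({"cmd": "pointer_move", "dx": dx, "dy": dy})

    ctrl = GestureController(send=print)
    ctrl.on_second_signal(4, 1)   # ignored: no first signal yet
    ctrl.on_first_signal(True)
    ctrl.on_second_signal(4, 1)   # emits a pointer_move command
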
  • Patent number: 10191558
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest.
    Type: Grant
    Filed: September 5, 2017
    Date of Patent: January 29, 2019
    Inventor: Uday Parshionikar
  • Publication number: 20180364810
    Abstract: Methods of interpreting user actions for controlling electronic devices are disclosed, along with apparatuses and systems implementing the methods. User actions can involve the eyes, head, face, fingers, hands, arms, and other body parts, as well as facial, verbal and/or mental actions, enabling hands-free and/or voice-free control of devices. Methods for large as well as fine motion and placement of objects are described. Actions performed before and during other actions can confirm the intent of those other actions, and objects can be moved or warped based on combinations of actions. Measurement of eye gaze, together with iterations and helper signals, improves accuracy of control. Triggers start and end recognition of user actions and generation of device commands.
    Type: Application
    Filed: March 14, 2018
    Publication date: December 20, 2018
    Inventor: Uday Parshionikar
  • Patent number: 10137363
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, mental activity, and other body actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive body actions along orthogonal axes is disclosed.
    Type: Grant
    Filed: March 24, 2017
    Date of Patent: November 27, 2018
    Inventor: Uday Parshionikar
  • Publication number: 20170371421
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest.
    Type: Application
    Filed: September 5, 2017
    Publication date: December 28, 2017
    Inventor: Uday Parshionikar
  • Patent number: 9785242
    Abstract: A method and apparatus are disclosed for a user to communicate with an electronic device. A processor receives user intention actions comprising facial expression (FE) information indicative of facial expressions and body information indicative of motion or position of one or more body parts of the user. When the FE or body information crosses a first level, the processor starts generating first signals based on that information to communicate with the electronic device; when the FE or body information crosses a second level, the processor can end generation of the first signals or modify them. An image processing or eye gaze tracking system can provide some of the FE or body information. The signals can modify attributes of an object of interest.
    Type: Grant
    Filed: October 15, 2013
    Date of Patent: October 10, 2017
    Inventor: Uday Parshionikar
  • Publication number: 20170249009
    Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, mental activity, and other body actions are disclosed, as are concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer-implementable methods, and non-transient computer storage media storing instructions that implement the disclosed concepts, principles and algorithms are described. Gestures for systems using eye gaze and head tracking are disclosed that can be used with augmented, mixed or virtual reality as well as with mobile or desktop computing. Use of periods of limited activity and of consecutive body actions along orthogonal axes is disclosed.
    Type: Application
    Filed: March 24, 2017
    Publication date: August 31, 2017
    Inventor: Uday Parshionikar