Patents by Inventor SANYA ATTARI
Sanya Attari has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11086403
Abstract: Systems and methods for multi-user shared virtual and augmented reality-based haptics are disclosed. One illustrative method for multi-user shared virtual and augmented reality-based haptics includes determining a position of an object; determining a viewpoint of at least one observer with respect to the object; determining a haptic effect to be output based at least in part on the position and the viewpoint; and outputting the haptic effect.
Type: Grant
Filed: April 27, 2020
Date of Patent: August 10, 2021
Assignee: Immersion Corporation
Inventors: William S. Rihn, David M. Birnbaum, Shadi Asfour, Satoshi Araki, Sanya Attari
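The abstract above describes a four-step method but, as with all entries in this listing, publishes no source code. The Python sketch below is a rough illustration of how such a pipeline could be wired together; every function name and the distance-and-viewpoint intensity heuristic are assumptions, not the patented implementation.

    import math

    def determine_haptic_effect(object_pos, observer_pos, observer_forward):
        """Toy heuristic: scale intensity by proximity and by how directly
        the observer's viewpoint faces the object (both assumptions)."""
        dx, dy, dz = (o - p for o, p in zip(object_pos, observer_pos))
        distance = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-6
        to_object = (dx / distance, dy / distance, dz / distance)
        facing = max(0.0, sum(a * b for a, b in zip(to_object, observer_forward)))
        intensity = facing / (1.0 + distance)  # stronger when close and in view
        return {"intensity": round(intensity, 3), "duration_ms": 50}

    def render_shared_haptics(object_pos, observers, output_device):
        # One effect per observer, so every participant sharing the scene
        # feels the same object from their own viewpoint.
        for name, (pos, forward) in observers.items():
            effect = determine_haptic_effect(object_pos, pos, forward)
            output_device(name, effect)

    # Example usage with a stubbed output device.
    render_shared_haptics(
        object_pos=(0.0, 1.0, 2.0),
        observers={"alice": ((0, 1, 0), (0, 0, 1)), "bob": ((3, 1, 2), (-1, 0, 0))},
        output_device=lambda user, fx: print(user, fx),
    )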
-
Patent number: 10943445
Abstract: Systems and methods for providing haptic effects with airflow and thermal stimulation are disclosed. One illustrative system described herein includes a haptic output device comprising a thermal actuator and a processor communicatively coupled to the haptic output device and configured to: receive a sensor signal from at least one sensor, determine a heat flux property based in part on the display signal or a predefined parameter, determine a haptic effect based in part on the heat flux property and the sensor signal, the heat flux property being representative of a rate of change of temperature, and transmit a haptic signal associated with the haptic effect to the haptic output device.
Type: Grant
Filed: May 13, 2019
Date of Patent: March 9, 2021
Assignee: Immersion Corporation
Inventors: David M. Birnbaum, Sanya Attari, Hossam Bahlool, Doug Billington, Bruno Garrido
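To make the processor's role in this abstract concrete, here is a minimal Python sketch of the described control flow, assuming a simple data schema for the sensor and display signals; the field names, the safety rule, and the airflow mapping are all illustrative assumptions rather than the claimed design.

    from dataclasses import dataclass

    @dataclass
    class HapticSignal:
        heat_flux_c_per_s: float   # commanded rate of temperature change
        airflow_level: float       # 0.0-1.0 fan duty (assumption)

    def determine_heat_flux(display_signal=None, default_c_per_s=0.5):
        # Heat flux here means a rate of temperature change; derive it from
        # scene content if available, otherwise fall back to a preset parameter.
        if display_signal and "scene_temperature_delta" in display_signal:
            return display_signal["scene_temperature_delta"] / display_signal.get("seconds", 1.0)
        return default_c_per_s

    def determine_effect(sensor_signal, display_signal=None):
        flux = determine_heat_flux(display_signal)
        # Back off if the skin-side sensor already reads warm (illustrative safety rule).
        if sensor_signal.get("skin_temp_c", 30.0) > 40.0:
            flux = min(flux, 0.0)
        return HapticSignal(heat_flux_c_per_s=flux,
                            airflow_level=min(1.0, abs(flux)))

    signal = determine_effect({"skin_temp_c": 33.2},
                              {"scene_temperature_delta": 4.0, "seconds": 8.0})
    print(signal)   # would be transmitted to the thermal/airflow actuator driver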
-
Publication number: 20200393903
Abstract: Multi-directional kinesthetic actuation systems are provided. The multi-directional kinesthetic actuation systems are configured to provide kinesthetic effects in multiple directions through both pulling and pushing forces. Multi-directional kinesthetic actuation systems include at least an active linkage, one or more hinges, and a motor. The motor is employed to advance or retract the active linkage. The active linkage is activated to provide increased buckling strength to transfer force to the hinges and deactivated to increase flexibility to facilitate retraction by the motor. The hinges are configured to translate the pushing or pulling force provided by the active linkage into a torque to be provided to a user's finger.
Type: Application
Filed: June 11, 2019
Publication date: December 17, 2020
Inventors: Vahid Khoshkava, Robert Lacroix, Colin Swindells, Sanya Attari
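This entry describes a mechanical system, but the push/pull sequencing it sets out (stiffen the linkage before pushing, soften it before retracting) can be summarized as a small control sketch. The Python below is only an interpretation of that sequencing; the hardware callbacks and state names are invented for illustration.

    from enum import Enum

    class LinkageState(Enum):
        ACTIVE = "stiff"      # high buckling strength, transmits pushing force
        PASSIVE = "flexible"  # easy for the motor to retract

    def command_kinesthetic_effect(direction, set_linkage, drive_motor):
        """direction: 'push' extends the linkage against the hinges,
        'pull' retracts it. Both callbacks are hardware stubs (assumptions)."""
        if direction == "push":
            set_linkage(LinkageState.ACTIVE)   # stiffen before transferring force
            drive_motor(+1)                    # advance the linkage toward the hinges
        elif direction == "pull":
            set_linkage(LinkageState.PASSIVE)  # soften so retraction is easy
            drive_motor(-1)                    # retract the linkage
        else:
            raise ValueError(direction)

    command_kinesthetic_effect("push",
                               set_linkage=lambda s: print("linkage:", s.value),
                               drive_motor=lambda step: print("motor step:", step))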
-
Publication number: 20200364994
Abstract: Systems and methods for providing haptic effects with airflow and thermal stimulation are disclosed. One illustrative system described herein includes a haptic output device comprising a thermal actuator and a processor communicatively coupled to the haptic output device and configured to: receive a sensor signal from at least one sensor, determine a heat flux property based in part on the display signal or a predefined parameter, determine a haptic effect based in part on the heat flux property and the sensor signal, the heat flux property being representative of a rate of change of temperature, and transmit a haptic signal associated with the haptic effect to the haptic output device.
Type: Application
Filed: May 13, 2019
Publication date: November 19, 2020
Applicant: Immersion Corporation
Inventors: David M. Birnbaum, Sanya Attari, Hossam Bahlool, Doug Billington, Bruno Garrido
-
Patent number: 10775892
Abstract: Systems and methods for multi-user shared virtual and augmented reality-based haptics are disclosed. One illustrative method for multi-user shared virtual and augmented reality-based haptics includes determining a position of an object; determining a viewpoint of at least one observer with respect to the object; determining a haptic effect to be output based at least in part on the position and the viewpoint; and outputting the haptic effect.
Type: Grant
Filed: April 20, 2018
Date of Patent: September 15, 2020
Assignee: Immersion Corporation
Inventors: William S. Rihn, David M. Birnbaum, Shadi Asfour, Satoshi Araki, Sanya Attari
-
Publication number: 20200257367
Abstract: Systems and methods for multi-user shared virtual and augmented reality-based haptics are disclosed. One illustrative method for multi-user shared virtual and augmented reality-based haptics includes determining a position of an object; determining a viewpoint of at least one observer with respect to the object; determining a haptic effect to be output based at least in part on the position and the viewpoint; and outputting the haptic effect.
Type: Application
Filed: April 27, 2020
Publication date: August 13, 2020
Applicant: Immersion Corporation
Inventors: William S. Rihn, David M. Birnbaum, Shadi Asfour, Satoshi Araki, Sanya Attari
-
Publication number: 20200201438
Abstract: The present invention provides a haptic fiducial sticker for an augmented reality (AR) environment. The haptic fiducial sticker includes a touch sensor, a wireless communication interface, and a haptic output device. The touch sensor is configured to detect a touch or user contact. The wireless communication interface is configured to transmit a unique identifier (UID) and receive haptic content associated with the UID, the haptic content including a haptic effect. The haptic output device is configured to render the haptic effect when the touch sensor detects the touch or user contact.
Type: Application
Filed: December 24, 2018
Publication date: June 25, 2020
Inventors: Alexia Mandeville, Sanya Attari, Douglas G. Billington, Christopher J. Ullrich
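The sticker described above has three roles: sense a touch, exchange a UID for haptic content over a wireless link, and render the received effect. A minimal Python sketch of that interaction pattern follows; the class, the content schema, and the callbacks are assumptions made for illustration, not the application's implementation.

    import uuid

    class HapticFiducialSticker:
        """Illustrative model of the sticker's touch-sense, UID-exchange,
        and render steps. The network call and actuator driver are stand-ins."""

        def __init__(self, fetch_haptic_content, actuator):
            self.uid = str(uuid.uuid4())           # unique identifier broadcast by the sticker
            self.fetch_haptic_content = fetch_haptic_content
            self.actuator = actuator

        def on_touch(self):
            # Transmit the UID, receive the haptic content associated with it,
            # then render the contained effect on the local haptic output device.
            content = self.fetch_haptic_content(self.uid)
            self.actuator(content["effect"])

    sticker = HapticFiducialSticker(
        fetch_haptic_content=lambda uid: {"effect": {"pattern": "double_click", "gain": 0.8}},
        actuator=lambda effect: print("rendering", effect),
    )
    sticker.on_touch()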
-
Patent number: 10665067
Abstract: Systems and methods for integrating haptics overlay in augmented reality are disclosed. One illustrative system described herein includes a haptic output device. The system also includes a display configured to output a visual effect. The system also includes a sensor for tracking a position of a proxy object. The system also includes a processor configured to: determine a modified visual effect based in part on data received from the sensor, determine a haptic effect based in part on data received from the sensor, transmit a display signal associated with the modified visual effect to the display, transmit a haptic signal associated with the haptic effect to the haptic output device; and output the haptic effect using the haptic output device.
Type: Grant
Filed: June 15, 2018
Date of Patent: May 26, 2020
Assignee: Immersion Corporation
Inventors: Satoshi Araki, Christopher J. Ullrich, Liwen Wu, Juan Manuel Cruz-Hernandez, Danny A. Grant, Sanya Attari, Colin Swindells
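As a sketch only, the Python below shows one reading of the processor's loop in this abstract: take tracked proxy-object data, derive a modified visual effect and a haptic effect, and send each to its output. The pose schema, the overlay name, and the depth-to-vibration rule are assumptions.

    def update_overlay(proxy_pose, send_display, send_haptic):
        """proxy_pose: tracked position of the physical proxy object
        (a dict here for illustration). Both send_* callbacks are stubs."""
        x, y, z = proxy_pose["position"]

        # Re-anchor the virtual texture overlay to wherever the proxy moved.
        visual = {"overlay": "virtual_leather", "anchor": (x, y, z)}

        # Assumed rule: haptic strength follows how far the user's hand
        # presses into the proxy surface.
        depth = max(0.0, proxy_pose.get("contact_depth_mm", 0.0))
        haptic = {"vibration": min(1.0, depth / 5.0)}

        send_display(visual)
        send_haptic(haptic)

    update_overlay({"position": (0.1, 0.0, 0.3), "contact_depth_mm": 2.5},
                   send_display=print, send_haptic=print)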
-
Patent number: 10583359
Abstract: A wearable device for providing haptic effects includes a wearable housing configured to be worn on to a portion of a hand of a wearer and an actuator secured to the wearable housing. The wearable housing includes a first digit segment configured to conform to a finger of the hand and a second digit segment configured to conform to another finger or a thumb of the hand. The actuator is configured to receive a command signal indicative of a virtual interaction related to touching or grasping a virtual object. In response to the command signal, the actuator provides a force onto the portion of the hand or provides a force to render a resistance to movement of the first and second digit segments toward each other. The actuator uses the wearable housing to mechanically stabilize the force towards the portion of the hand.
Type: Grant
Filed: December 28, 2017
Date of Patent: March 10, 2020
Assignee: Immersion Corporation
Inventors: Robert Heubel, Juan Manuel Cruz-Hernandez, Vahid Khoshkava, Sanya Attari, Colin Swindells, Satoshi Araki
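The command-signal behavior in this abstract (push against the hand for a touch, resist digit closure for a grasp) can be pictured with a short dispatcher. The Python below is a hedged illustration; the command fields, units, and callback names are invented here and are not taken from the patent.

    def handle_command(command, apply_force, apply_resistance):
        """command: dict describing the virtual interaction (schema assumed).
        'touch' pushes against the worn portion of the hand; 'grasp' resists
        the two digit segments closing toward each other."""
        kind = command["interaction"]
        if kind == "touch":
            apply_force(newtons=command.get("force_n", 1.0))
        elif kind == "grasp":
            # Stiffer virtual objects resist finger/thumb closure more strongly.
            apply_resistance(stiffness=command.get("object_stiffness", 0.5))
        else:
            raise ValueError(f"unknown interaction: {kind}")

    handle_command({"interaction": "grasp", "object_stiffness": 0.9},
                   apply_force=lambda newtons: print("push", newtons, "N"),
                   apply_resistance=lambda stiffness: print("resist closure", stiffness))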
-
Publication number: 20190385419
Abstract: Systems and methods for integrating haptics overlay in augmented reality are disclosed. One illustrative system described herein includes a haptic output device. The system also includes a display configured to output a visual effect. The system also includes a sensor for tracking a position of a proxy object. The system also includes a processor configured to: determine a modified visual effect based in part on data received from the sensor, determine a haptic effect based in part on data received from the sensor, transmit a display signal associated with the modified visual effect to the display, transmit a haptic signal associated with the haptic effect to the haptic output device; and output the haptic effect using the haptic output device.
Type: Application
Filed: June 15, 2018
Publication date: December 19, 2019
Inventors: Satoshi Araki, Christopher J. Ullrich, Liwen Wu, Juan Manuel Cruz-Hernandez, Danny A. Grant, Sanya Attari, Colin Swindells
-
Publication number: 20190324541
Abstract: Systems and methods for multi-user shared virtual and augmented reality-based haptics are disclosed. One illustrative method for multi-user shared virtual and augmented reality-based haptics includes determining a position of an object; determining a viewpoint of at least one observer with respect to the object; determining a haptic effect to be output based at least in part on the position and the viewpoint; and outputting the haptic effect.
Type: Application
Filed: April 20, 2018
Publication date: October 24, 2019
Applicant: Immersion Corporation
Inventors: William S. Rihn, David M. Birnbaum, Shadi Asfour, Satoshi Araki, Sanya Attari
-
Publication number: 20190324549
Abstract: Systems, devices, and methods for providing immersive reality interface modes are disclosed. The devices include haptic actuators, a computing unit, a gesture detection system, and an immersive reality display device. The gesture detection system detects gestures made by a user during interaction with an immersive reality environment, which is displayed by the immersive reality display device. The computing unit generates and operates an interface mode by which the user can interact, through gestures, with the immersive reality environment and coordinates the provision of haptic, visual, and audio outputs.
Type: Application
Filed: April 20, 2018
Publication date: October 24, 2019
Inventors: Satoshi Araki, William S. Rihn, Sanya Attari, David M. Birnbaum
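To give a flavor of what "coordinating haptic, visual, and audio outputs" for a gesture-driven interface mode might look like, here is a small Python sketch; the gesture vocabulary, the cue names, and the callback layout are assumptions for illustration only.

    def run_interface_mode(detect_gesture, outputs):
        """detect_gesture() returns a gesture name or None; outputs maps each
        channel ('haptic', 'visual', 'audio') to a render callback."""
        responses = {
            "pinch": {"haptic": "click",   "visual": "highlight", "audio": "tick"},
            "swipe": {"haptic": "ramp",    "visual": "scroll",    "audio": "whoosh"},
            "grab":  {"haptic": "squeeze", "visual": "attach",    "audio": None},
        }
        gesture = detect_gesture()
        if gesture in responses:
            # Coordinate the three output channels for this interaction.
            for channel, cue in responses[gesture].items():
                if cue is not None:
                    outputs[channel](cue)

    run_interface_mode(lambda: "pinch",
                       {"haptic": print, "visual": print, "audio": print})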
-
Publication number: 20190204929
Abstract: Devices and methods for dynamic association of user inputs to mobile device actions are provided. User sensing panels associated with a mobile device may detect the presence or contact of multiple hand parts. A signal from the user sensing panels indicative of the presence or contact may be associated with a mobile device action. To associate the signal with the mobile device action, a processor associated with the mobile device may determine the identities of the hand parts and may recognize user gestures of the identified hand parts. A processor may cause the execution of the mobile device action upon receipt of the signal.
Type: Application
Filed: December 29, 2017
Publication date: July 4, 2019
Inventors: Sanya Attari, Colin Swindells
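The core of this abstract is a mapping from identified hand parts and gestures to device actions. A minimal Python sketch of that dispatch step follows; the signal schema, the example mappings, and the action names are assumptions, not the application's design.

    def dispatch_panel_signal(signal, action_map, execute):
        """signal: which hand part touched a sensing panel and with what gesture
        (assumed schema). action_map associates (hand_part, gesture) pairs with
        mobile device actions."""
        key = (signal["hand_part"], signal["gesture"])   # e.g. ("thumb", "double_tap")
        action = action_map.get(key)
        if action is not None:
            execute(action)

    action_map = {
        ("thumb", "double_tap"): "take_screenshot",
        ("index_finger", "swipe_down"): "volume_down",
        ("palm", "squeeze"): "launch_camera",
    }
    dispatch_panel_signal({"hand_part": "palm", "gesture": "squeeze"},
                          action_map, execute=print)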
-
Publication number: 20190201785
Abstract: A wearable device for providing haptic effects includes a wearable housing configured to be worn on to a portion of a hand of a wearer and an actuator secured to the wearable housing. The wearable housing includes a first digit segment configured to conform to a finger of the hand and a second digit segment configured to conform to another finger or a thumb of the hand. The actuator is configured to receive a command signal indicative of a virtual interaction related to touching or grasping a virtual object. In response to the command signal, the actuator provides a force onto the portion of the hand or provides a force to render a resistance to movement of the first and second digit segments toward each other. The actuator uses the wearable housing to mechanically stabilize the force towards the portion of the hand.
Type: Application
Filed: December 28, 2017
Publication date: July 4, 2019
Inventors: Robert Heubel, Juan Manuel Cruz-Hernandez, Vahid Khoshkava, Sanya Attari, Colin Swindells, Satoshi Araki
-
Publication number: 20180373325
Abstract: A method and system of generating haptic effects using haptic dimensions in a virtual environment is presented. The method includes identifying a point of interest within a virtual environment and generating a plurality of haptic dimensions based on the point of interest. The haptic dimensions define a point of interest region. Additionally, a gaze orientation of a user is determined and based on that gaze orientation, an amount of the user's gaze that is directed to the point of interest region is determined. If the amount of the user's gaze directed to the point of interest region is below a threshold amount, a haptic effect is generated.
Type: Application
Filed: June 22, 2017
Publication date: December 27, 2018
Inventors: Jared C. Rosso, William S. Rihn, Sanya Attari
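The threshold test in this abstract (play a haptic cue when too little of the user's gaze falls on the point-of-interest region) lends itself to a short numerical sketch. The Python below is one crude interpretation; the cone-based gaze measure, the angles, and the thresholds are assumptions, not the application's method.

    import math

    def gaze_fraction_on_region(gaze_dir, poi_dir, region_half_angle_deg):
        """Crude proxy for 'amount of gaze on the region': 1.0 when the gaze
        vector falls inside a cone around the point of interest, tapering to 0
        outside it."""
        dot = max(-1.0, min(1.0, sum(g * p for g, p in zip(gaze_dir, poi_dir))))
        angle = math.degrees(math.acos(dot))
        return max(0.0, 1.0 - angle / (2.0 * region_half_angle_deg))

    def maybe_nudge_user(gaze_dir, poi_dir, trigger_haptic,
                         region_half_angle_deg=15.0, threshold=0.5):
        # If too little of the user's gaze is on the point-of-interest region,
        # play a haptic cue to draw attention back to it.
        if gaze_fraction_on_region(gaze_dir, poi_dir, region_half_angle_deg) < threshold:
            trigger_haptic({"pattern": "pulse", "intensity": 0.6})

    maybe_nudge_user(gaze_dir=(0.0, 0.0, 1.0),
                     poi_dir=(0.7, 0.0, 0.714),   # roughly 44 degrees off-axis
                     trigger_haptic=print)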
-
Publication number: 20180011538
Abstract: Embodiments generate haptic effects in response to a user input (e.g., pressure based or other gesture). Embodiments receive a first input range corresponding to user input and receive a haptic profile corresponding to the first input range. During a first dynamic portion of the haptic profile, embodiments generate a dynamic haptic effect that varies based on values of the first input range during the first dynamic portion. Further, at a first trigger position of the haptic profile, embodiments generate a triggered haptic effect.
Type: Application
Filed: July 7, 2017
Publication date: January 11, 2018
Inventors: William S. Rihn, Sanya Attari, Liwen Wu, Min Lee, David Birnbaum
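This abstract describes a haptic profile with a dynamic portion (effect strength follows the input value) and a trigger position (a one-shot effect fires when the input crosses it). The Python sketch below illustrates that structure under assumed numbers; the range, the trigger point, and the callback names are not taken from the application.

    def render_profile(pressure, play_dynamic, play_trigger,
                       dynamic_range=(0.0, 0.7), trigger_position=0.9,
                       already_triggered=False):
        """pressure: normalized 0.0-1.0 reading from a pressure gesture.
        Inside the dynamic portion the effect strength tracks the input value;
        crossing the trigger position fires a one-shot triggered effect."""
        lo, hi = dynamic_range
        if lo <= pressure <= hi:
            play_dynamic(strength=(pressure - lo) / (hi - lo))
        if pressure >= trigger_position and not already_triggered:
            play_trigger("detent_click")
            already_triggered = True
        return already_triggered

    triggered = False
    for p in (0.1, 0.4, 0.7, 0.95):   # simulated press getting firmer
        triggered = render_profile(p,
                                   play_dynamic=lambda strength: print("dyn", round(strength, 2)),
                                   play_trigger=print,
                                   already_triggered=triggered)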