Abstract: The present disclosure is generally directed to systems and methods for providing haptic effects based on information complementary to multimedia content. For example, one disclosed method includes the steps of receiving multimedia data comprising multimedia content and complementary data, wherein the complementary data describes the multimedia content, determining a haptic effect based at least in part on the complementary data, and outputting the haptic effect while playing the multimedia content.
Type:
Application
Filed:
February 6, 2018
Publication date:
June 7, 2018
Applicant:
Immersion Corporation
Inventors:
Vincent Levesque, Ali Modarres, Juan Manuel Cruz-Hernandez, Jamal Saboune
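As an illustration only (not taken from the patent text), a minimal Python sketch of the pipeline the abstract above describes: complementary data that describes the multimedia content is mapped to a haptic effect, which is output alongside playback. All names, keywords, and values are hypothetical.

    # Hypothetical sketch: map complementary data (e.g., descriptive text that
    # accompanies the media) to a haptic effect and emit it during playback.

    KEYWORD_EFFECTS = {
        "explosion": {"intensity": 1.0, "duration_ms": 300},
        "rain":      {"intensity": 0.3, "duration_ms": 1500},
        "engine":    {"intensity": 0.6, "duration_ms": 800},
    }

    def determine_haptic_effect(complementary_text: str):
        """Pick an effect based on words describing the multimedia content."""
        for keyword, effect in KEYWORD_EFFECTS.items():
            if keyword in complementary_text.lower():
                return effect
        return None

    def play_with_haptics(frames):
        """frames: iterable of (media_frame, complementary_text) pairs."""
        for media_frame, text in frames:
            effect = determine_haptic_effect(text)
            if effect is not None:
                print(f"haptic out: {effect} while playing {media_frame}")

    play_with_haptics([("frame_001", "A loud explosion rocks the street."),
                       ("frame_002", "Rain falls quietly.")])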
Abstract: One illustrative system disclosed herein includes a curved device that includes a curved outer housing. The illustrative system also includes a sensor configured to detect a user interaction with the curved device and transmit a sensor signal associated with the user interaction. The illustrative system additionally includes a processor in communication with the sensor, the processor configured to: receive the sensor signal from the sensor; determine a user interaction based on the sensor signal; determine a first haptic effect based at least in part on the user interaction; and transmit a haptic signal associated with the first haptic effect. The illustrative system also includes a haptic output device configured to receive the haptic signal and output the first haptic effect.
Type:
Application
Filed:
February 6, 2018
Publication date:
June 7, 2018
Applicant:
Immersion Corporation
Inventors:
Vincent Levesque, Danny Grant, Ali Modarres, Jamal Saboune
Abstract: A system for generating haptic effects includes a virtual environment having environmental properties, virtual objects, and object property information. A programmatic virtual sensor is placed on a virtual object in the virtual environment. A rendering engine for the virtual environment renders the virtual environment. A module for the virtual sensor receives virtual sensor data including position and time for the sensor and calculates sensor output data including acceleration data and object interaction data for the virtual sensor. A haptic track generator generates a haptic track based on the sensor output data.
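As a hedged illustration of the abstract above (not the patented method), a short Python sketch: the virtual sensor's sampled positions and times are differentiated into acceleration estimates, whose magnitude is mapped into a simple haptic track. Function names, the gain, and the clamping are assumptions.

    # Hypothetical sketch: derive acceleration from a virtual sensor's sampled
    # positions and convert its magnitude into a clamped haptic track.

    def accelerations(samples):
        """samples: list of (t, x) pairs for the virtual sensor; returns
        central finite-difference acceleration estimates."""
        acc = []
        for i in range(1, len(samples) - 1):
            (t0, x0), (t1, x1), (t2, x2) = samples[i - 1], samples[i], samples[i + 1]
            dt = (t2 - t0) / 2.0
            acc.append((x2 - 2 * x1 + x0) / (dt * dt))
        return acc

    def haptic_track(samples, gain=0.01, max_level=1.0):
        """Map acceleration magnitude to a clamped vibration level per sample."""
        return [min(max_level, abs(a) * gain) for a in accelerations(samples)]

    samples = [(0.00, 0.0), (0.01, 0.0), (0.02, 0.05), (0.03, 0.2), (0.04, 0.2)]
    print(haptic_track(samples))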
Abstract: A method of producing a haptic effect includes receiving a sensory content signal from a user interface device, receiving a sensor signal of a body position of a first body part of a user with respect to a second body part of the user, generating the haptic effect using the sensory content signal and the sensor signal, and applying a drive signal to a haptic actuator to produce the haptic effect.
Type:
Grant
Filed:
September 23, 2013
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
Danny A. Grant, Juan Manuel Cruz-Hernandez
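A minimal sketch, in Python, of the idea in the abstract above: a content-derived effect level is shaped by the relative position of two body parts (here an assumed joint angle) before the actuator drive signal is produced. All names and the weighting rule are illustrative assumptions, not the patented algorithm.

    # Hypothetical sketch: combine a sensory content signal with a relative
    # body-position reading to shape the haptic actuator drive signal.

    def haptic_magnitude(content_level: float, joint_angle_deg: float) -> float:
        """Scale the content-derived level by relative body position
        (e.g., flexion angle between forearm and hand)."""
        position_weight = max(0.0, min(1.0, joint_angle_deg / 90.0))
        return content_level * position_weight

    def drive_signal(magnitude: float, n_samples: int = 8):
        """Return a constant-amplitude drive buffer for the actuator."""
        return [magnitude] * n_samples

    print(drive_signal(haptic_magnitude(content_level=0.8, joint_angle_deg=45.0)))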
Abstract: A method for encoding haptic information inside a multi-media file having content includes changing a portion of the content in the multi-media file, and adding the haptic information to the changed portion of the content, the haptic information corresponding to a haptic signal for generating a haptic effect upon playback of the multi-media file. A method for decoding haptic information from inside a multi-media file having content includes locating the haptic information inside the multi-media file, and generating a haptic signal based on the located haptic information during playback of the content of the multi-media file. A method includes receiving a multi-media signal comprising an audio signal and a haptic signal with a receiver of a haptic device, and outputting a haptic effect with a haptic output device of the haptic device based on the haptic signal in the multi-media signal.
Type:
Grant
Filed:
March 15, 2013
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
Arnab Sen, Christopher J. Ullrich, Juan Manuel Cruz-Hernandez, Ali Modarres
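To make the encode/decode idea in the abstract above concrete, here is one plausible (and deliberately simplified) Python sketch: a portion of the content, the least-significant bits of some audio samples, is changed to carry the haptic information, and the decoder reads those bits back during playback. This is an illustration under assumed details, not the patented scheme.

    # Hypothetical sketch of hiding haptic data in media content: replace the
    # least-significant bit of audio samples (the "changed portion") with bits
    # of the haptic signal, then recover them at playback time.

    def encode(audio_samples, haptic_bits):
        out = list(audio_samples)
        for i, bit in enumerate(haptic_bits):
            out[i] = (out[i] & ~1) | bit          # overwrite LSB with a haptic bit
        return out

    def decode(audio_samples, n_bits):
        return [s & 1 for s in audio_samples[:n_bits]]  # recover haptic bits

    audio = [100, 57, 254, 33, 128, 77]
    haptic = [1, 0, 1, 1]
    encoded = encode(audio, haptic)
    assert decode(encoded, len(haptic)) == haptic
    print(encoded)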
Abstract: One illustrative computing device disclosed herein includes a sensor configured to detect a user interaction with a physical object and transmit a sensor signal associated with the user interaction. The illustrative computing device also includes a processor in communication with the sensor, the processor configured to: receive the sensor signal; determine a characteristic of the physical object based on the sensor signal; determine a haptic effect associated with the characteristic; and transmit a haptic signal associated with the haptic effect. The illustrative computing device further includes a haptic output device in communication with the processor, the haptic output device configured to receive the haptic signal and output the haptic effect.
Type:
Grant
Filed:
February 6, 2017
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
Vincent Levesque, Wei Zhu, Eric Gervais, Fengtian An, Eric Lajeunesse, Johnny Maalouf
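A small Python sketch of the flow in the abstract above, offered only as an illustration: a characteristic of the physical object is inferred from the sensor signal, and a haptic effect associated with that characteristic is selected. The toy classifier, material names, and effect table are assumptions.

    # Hypothetical sketch: look up a haptic effect from a characteristic of the
    # physical object inferred from the sensor signal.

    EFFECTS_BY_MATERIAL = {
        "wood":  {"pattern": "low_rumble",  "intensity": 0.4},
        "metal": {"pattern": "sharp_click", "intensity": 0.9},
        "cloth": {"pattern": "soft_buzz",   "intensity": 0.2},
    }

    def classify_object(sensor_signal: dict) -> str:
        """Toy classifier: use measured surface hardness to pick a material."""
        hardness = sensor_signal["hardness"]
        if hardness > 0.8:
            return "metal"
        return "wood" if hardness > 0.4 else "cloth"

    def haptic_for_interaction(sensor_signal: dict) -> dict:
        return EFFECTS_BY_MATERIAL[classify_object(sensor_signal)]

    print(haptic_for_interaction({"hardness": 0.85}))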
Abstract: One illustrative system disclosed herein includes a sensor configured to detect a gesture and transmit an associated sensor signal. The gesture includes a first position at a distance from a surface and a second position contacting the surface. The system also includes a processor in communication with the sensor and configured to: receive the sensor signal from the sensor, and determine one or more haptic effects based at least in part on the sensor signal. The one or more haptic effects are configured to provide substantially continuous haptic feedback throughout the gesture. The processor is also configured to generate one or more haptic signals based at least in part on the one or more haptic effects, and transmit the one or more haptic signals. The system includes a haptic output device for receiving the one or more haptic signals and outputting the one or more haptic effects.
Type:
Grant
Filed:
December 11, 2015
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
Vahid Khoshkava, Vincent Levesque, Juan Manuel Cruz-Hernandez, Mansoor Alghooneh, William Rihn
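As a hedged sketch of the "substantially continuous feedback throughout the gesture" idea in the abstract above: effect intensity can be ramped smoothly from the hover position at a distance from the surface to the contact position, so feedback never drops out mid-gesture. The Python below uses an assumed linear ramp and hover range.

    # Hypothetical sketch: keep haptic feedback continuous as a gesture moves
    # from hovering above the surface (distance > 0) to contact (distance == 0).

    def effect_intensity(distance_mm: float, hover_range_mm: float = 30.0) -> float:
        """Ramp intensity up as the finger approaches, reaching 1.0 at contact."""
        if distance_mm <= 0.0:
            return 1.0
        return max(0.0, 1.0 - distance_mm / hover_range_mm)

    # Simulated gesture: approach, then press the surface.
    for d in [40.0, 25.0, 10.0, 2.0, 0.0]:
        print(f"distance {d:5.1f} mm -> intensity {effect_intensity(d):.2f}")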
Abstract: A system for generating haptic effects receives haptic permissions settings and associates the haptic permissions settings with a range of permitted haptic parameters. The system receives haptic parameters and modifies/filters the haptic parameters based on the range of permitted haptic parameters. The system then generates a haptic signal based on the modified haptic parameters and outputs the haptic signal to a haptic output device to generate the haptic effects.
Type:
Grant
Filed:
April 21, 2016
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
David M. Birnbaum, Stephen D. Rank, Leonard Soskin, Danny A. Grant, Robert W. Heubel
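A minimal Python sketch of the modify/filter step in the abstract above: requested haptic parameters are clamped to the range permitted by the haptic permissions settings before the output signal is generated. The parameter names and the example "quiet" profile are assumptions.

    # Hypothetical sketch: clamp requested haptic parameters to the range
    # allowed by the permission settings before generating the output signal.

    PERMITTED = {"intensity": (0.0, 0.5), "duration_ms": (0, 500)}  # assumed "quiet" profile

    def filter_parameters(requested: dict, permitted: dict = PERMITTED) -> dict:
        clamped = {}
        for name, value in requested.items():
            low, high = permitted.get(name, (value, value))
            clamped[name] = min(max(value, low), high)
        return clamped

    print(filter_parameters({"intensity": 0.9, "duration_ms": 1200}))
    # -> {'intensity': 0.5, 'duration_ms': 500}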
Abstract: A system performs haptic challenge-response functionality. The system generates one or more haptic effects, provides the one or more haptic effects as a haptic challenge question to a user via a haptic output device, and receives an answer from the user corresponding to the haptic challenge question. The system then determines, based on a model of human perception, whether the answer corresponds to a correct answer to the haptic challenge question. One embodiment predicts the correct answer to the haptic challenge question, compares the correct answer with the answer received from the user, and determines that the user is a human when the answer matches the correct answer. One embodiment repeats the generating, the providing, the receiving, the predicting, and the comparing when the answer does not match the correct answer.
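To make the challenge-response loop in the abstract above more tangible, here is a hedged Python sketch: two pulses are generated, the user says which felt stronger, and the answer is checked against what a simple perception model predicts; on a mismatch the challenge is repeated. The power-law perception model and the attempt limit are assumptions for illustration only.

    # Hypothetical sketch of a haptic challenge-response loop: play two pulses,
    # ask which felt stronger, and accept the user if the answer matches what a
    # simple perception model predicts a human would feel.

    import random

    def perceived_strength(amplitude: float) -> float:
        """Toy perception model: perceived strength grows sublinearly."""
        return amplitude ** 0.6

    def run_challenge(user_answer_fn, max_attempts: int = 3) -> bool:
        for _ in range(max_attempts):
            a, b = random.uniform(0.2, 1.0), random.uniform(0.2, 1.0)
            correct = "first" if perceived_strength(a) > perceived_strength(b) else "second"
            if user_answer_fn(a, b) == correct:
                return True          # answer matches the predicted correct answer
        return False                  # repeated mismatches: likely not a human

    # Simulated user who genuinely feels the pulses:
    print(run_challenge(lambda a, b: "first" if a > b else "second"))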
Abstract: Systems, methods, and associated software are described herein for enabling a regular user of an end user device, such as a cellular telephone, to customize parameters associated with haptic effects applied to the user by the end user device. In one implementation, among several, a method described herein includes enabling a user of an end user device to access software adapted to design or modify haptic effects of the end user device. The method further includes enabling the user to open a haptic track file and enter or modify parameters associated with the haptic effects of the opened haptic track file.
Type:
Grant
Filed:
December 5, 2013
Date of Patent:
June 5, 2018
Assignee:
Immersion Corporation
Inventors:
Erin B. Ramsay, Robert W. Heubel, Jason D. Fleming, Stephen D. Rank
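As a sketch of the customization workflow in the abstract above, under an assumed file layout (the patent does not specify one), the Python below opens a haptic track file, lets the user change a parameter of a named effect, and writes the file back.

    # Hypothetical sketch: open a haptic track file (here a JSON document with
    # an assumed layout), modify one effect parameter, and save it back.

    import json

    def load_track(path):
        with open(path) as f:
            return json.load(f)

    def set_parameter(track: dict, effect_name: str, param: str, value):
        """Modify one parameter of one named effect in the opened track."""
        track["effects"][effect_name][param] = value
        return track

    def save_track(track: dict, path):
        with open(path, "w") as f:
            json.dump(track, f, indent=2)

    # Example usage (file name and layout are assumed, not from the patent):
    # track = load_track("ringtone.hapt.json")
    # save_track(set_parameter(track, "buzz", "magnitude", 0.7), "ringtone.hapt.json")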
Abstract: A user interface device includes a flexible layer comprising a touch surface configured to receive a touch by a user, a plurality of haptic cells covered by the flexible layer, each haptic cell comprising a haptic output device, a sensor configured to sense an amount and/or rate of deformation of the flexible layer when a user touches the touch surface, and a processor configured to receive an output signal from the sensor, generate a haptic control signal based on the output signal from the sensor, and output the haptic control signal to at least one haptic output device of the plurality of haptic cells to cause the haptic output device to deform an associated haptic cell in response to the sensed deformation of the flexible layer.
Type:
Grant
Filed:
June 28, 2016
Date of Patent:
May 29, 2018
Assignee:
Immersion Corporation
Inventors:
Ali Modarres, Juan Manuel Cruz-Hernandez, Danny A. Grant, Vincent Levesque
Abstract: A haptic peripheral includes a housing and a haptically-enhanced user input element. The haptically-enhanced user input element is configured to receive an input from a user, and includes a mechanical key having a keycap with a user contact surface configured to contact the user and a smart material actuator integrated onto the user contact surface of the keycap. The smart material actuator is configured to receive a control signal from a processor and is configured to deform at least a portion of the user contact surface relative to the keycap of the mechanical key in response to the control signal from the processor to thereby provide a haptic effect to a user of the haptic peripheral. The haptic peripheral may also include a braking actuator coupled to the mechanical key to hold the mechanical key in a depressed position to indicate an inactive status to a user. In addition, the haptic peripheral and the haptically-enhanced user input element may be modular.
Abstract: A user interface device having a user input component, a mechanical metamaterial region, one or more actuators, and a control unit is presented. The mechanical metamaterial region is located over the user input component. The one or more actuators are coupled to the mechanical metamaterial region, which has an internal structure that is mechanically alterable with the one or more actuators, and has a mechanical property that changes in response to the alteration of the internal structure by the one or more actuators. The control unit is in communication with the one or more actuators, and is configured to determine whether the user input component is to be hidden from tactile perception, and to activate the one or more actuators to mechanically alter the internal structure of the mechanical metamaterial region.
Type:
Grant
Filed:
May 1, 2017
Date of Patent:
May 29, 2018
Assignee:
Immersion Corporation
Inventors:
Jamal Saboune, Juan Manuel Cruz-Hernandez, Vahid Khoshkava
Abstract: One illustrative electrostatic actuator disclosed herein includes a first electrode, a second electrode, a first insulation layer between the first electrode and the second electrode, a first resilient material between the first electrode and the second electrode, a third electrode, a second insulation layer between the second electrode and the third electrode, and a second resilient material between the second electrode and the third electrode. The first electrode and the third electrode receive power from a power supply and responsively generate a first polarity. The second electrode receives power from the power supply and responsively generates a second polarity that is opposite the first polarity. The first polarity and the second polarity generate a first attractive force between the first electrode and the second electrode and a second attractive force between the second electrode and the third electrode. The electrostatic actuator may be part of a user interface.
Abstract: A haptic output device includes a touch surface, a sensor configured to sense an input at the touch surface, and a controller configured to read the sensor, identify a location of the input, switch from a read mode to a write mode, and write a voltage based on the location of the input to generate an electrostatic output.
Type:
Grant
Filed:
March 23, 2016
Date of Patent:
May 29, 2018
Assignee:
Immersion Corporation
Inventors:
Juan Manuel Cruz-Hernandez, Danny A. Grant
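A hedged Python sketch of the read/write cycle in the abstract above: the controller reads the touch sensor, identifies the location of the input, switches to write mode, and writes a location-dependent voltage for the electrostatic output. The frame format, threshold, and voltage mapping are illustrative assumptions.

    # Hypothetical sketch of the controller's read/write cycle.

    def locate_input(sensor_frame):
        """Return (row, col) of the strongest reading, or None if untouched."""
        best, where = 0.0, None
        for r, row in enumerate(sensor_frame):
            for c, value in enumerate(row):
                if value > best:
                    best, where = value, (r, c)
        return where if best > 0.5 else None

    def controller_cycle(sensor_frame):
        location = locate_input(sensor_frame)          # read mode
        if location is None:
            return None
        # write mode: voltage chosen per location (toy mapping)
        voltage = 50.0 + 10.0 * location[1]
        return {"mode": "write", "location": location, "voltage": voltage}

    print(controller_cycle([[0.0, 0.1, 0.0], [0.0, 0.9, 0.2]]))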
Abstract: Systems and methods for monitoring insulation integrity for electrostatic friction are disclosed. One system may include a touch sensitive interface configured to detect user interaction; an electrostatic haptic output device configured to output one or more electrostatic haptic effects to the touch sensitive interface; a processor in communication with the touch sensitive interface and the electrostatic haptic output device, the processor configured to: determine an operating condition associated with the electrostatic haptic output device; determine a corrective action associated with the operating condition; and apply the corrective action.
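To illustrate the monitoring loop in the abstract above, a short Python sketch maps an observed operating condition of the electrostatic output device to a corrective action and applies it. The specific conditions, thresholds, and actions are assumptions, not those claimed in the patent.

    # Hypothetical sketch: operating condition -> corrective action.

    CORRECTIVE_ACTIONS = {
        "insulation_worn":      "disable_electrostatic_effects",
        "leakage_current_high": "reduce_drive_voltage",
        "nominal":              "no_action",
    }

    def determine_condition(leakage_ma: float, insulation_ohm: float) -> str:
        if insulation_ohm < 1e5:
            return "insulation_worn"
        if leakage_ma > 0.5:
            return "leakage_current_high"
        return "nominal"

    def apply_corrective_action(leakage_ma: float, insulation_ohm: float) -> str:
        return CORRECTIVE_ACTIONS[determine_condition(leakage_ma, insulation_ohm)]

    print(apply_corrective_action(leakage_ma=0.8, insulation_ohm=2e6))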
Abstract: A method and apparatus for generating haptic surface texture with a deformable surface layer are disclosed. The haptic device includes a flexible surface layer, a haptic substrate, and a deforming mechanism. The flexible surface layer is made of elastic materials and is capable of reconfiguring its surface characteristics. The haptic substrate, in one embodiment, provides a first pattern in response to a first activating signal. Alternatively, the haptic substrate is capable of providing a second pattern in accordance with a second activating signal. The deforming mechanism is configured to change the flexible surface from a first surface characteristic to a second surface characteristic in accordance with the first pattern.
Type:
Application
Filed:
November 13, 2017
Publication date:
May 24, 2018
Applicant:
Immersion Corporation
Inventors:
Robert W. Heubel, Ryan Steger, Robert A. Lacroix, Muge Bakircioglu
Abstract: A keyless entry device is provided. The keyless entry device includes a transceiver, a haptic actuator coupled to a drive circuit, and a processor coupled to the transceiver and the drive circuit. The transceiver communicates with a vehicle over a communication channel. The processor determines proximity information between the keyless entry device and the vehicle, selects a control signal based on the proximity information, and outputs the control signal to the drive circuit to cause the haptic actuator to periodically or continuously generate a haptic effect to a user.
Type:
Grant
Filed:
April 30, 2017
Date of Patent:
May 22, 2018
Assignee:
Immersion Corporation
Inventors:
Natasha Margaret Minenko Flaherty, David M. Birnbaum
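A minimal Python sketch of the selection step in the abstract above: proximity between the keyless entry device and the vehicle is estimated (here from an assumed received-signal-strength reading) and used to pick a periodic or continuous haptic control signal. The bins and signal parameters are illustrative.

    # Hypothetical sketch: pick a haptic control signal from estimated
    # fob-to-vehicle proximity; pulse faster as the user gets closer.

    def proximity_from_rssi(rssi_dbm: float) -> str:
        """Toy proximity bins derived from received signal strength."""
        if rssi_dbm > -50:
            return "near"
        return "mid" if rssi_dbm > -70 else "far"

    CONTROL_SIGNALS = {
        "near": {"mode": "continuous", "intensity": 0.8},
        "mid":  {"mode": "periodic", "period_ms": 500, "intensity": 0.5},
        "far":  {"mode": "periodic", "period_ms": 2000, "intensity": 0.3},
    }

    def select_control_signal(rssi_dbm: float) -> dict:
        return CONTROL_SIGNALS[proximity_from_rssi(rssi_dbm)]

    print(select_control_signal(-65))   # -> periodic pulses every 500 ms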
Abstract: A haptic system includes a structure having a conductive layer and a reactive layer. The conductive layer is coupled to a power source and the reactive layer is coupled to a switch having a first state and a second state. The power source enables the conductive layer to generate a charge. The first state of the switch operates the reactive layer to block the establishment of a tissue-stimulating electric field. The second state of the switch operates the reactive layer to enable the establishment of a tissue-stimulating electric field to generate a touchless haptic effect.
Abstract: Systems and methods for visual processing of spectrograms to generate haptic effects are disclosed. In one embodiment, a signal comprising at least an audio signal is received. One or more spectrograms may be generated based at least in part on the received signal. One or more haptic effects may be determined based at least in part on the spectrogram. For example, a generated spectrogram may be a two-dimensional image and this image can be analyzed to determine one or more haptic effects. Once a haptic effect has been determined, one or more haptic output signals can be generated. A generated haptic output signal may be output to one or more haptic output devices.
Type:
Application
Filed:
January 4, 2018
Publication date:
May 17, 2018
Applicant:
Immersion Corporation
Inventors:
Juan Manuel Cruz-Hernandez, Jamal Saboune
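As a hedged illustration of the abstract above (not the patented method), the Python sketch below computes a spectrogram from an audio signal, treats it as a 2-D image, and emits a haptic event for each time slice whose column holds enough energy. The relative threshold and the intensity mapping are assumptions.

    # Hypothetical sketch: spectrogram columns with high energy -> haptic events.

    import numpy as np
    from scipy import signal

    def haptic_events_from_audio(audio, fs, rel_threshold=0.2):
        """Return (time, intensity) pairs for time slices whose spectrogram
        column exceeds rel_threshold of the global maximum."""
        freqs, times, sxx = signal.spectrogram(audio, fs=fs, nperseg=256)
        peak = sxx.max() or 1.0
        events = []
        for i, t in enumerate(times):
            column = sxx[:, i]                  # one vertical strip of the 2-D image
            if column.max() > rel_threshold * peak:
                events.append((float(t), float(column.max() / peak)))
        return events

    fs = 8000
    t = np.linspace(0, 1, fs, endpoint=False)
    audio = np.sin(2 * np.pi * 200 * t) * (t > 0.5)   # tone only in the second half
    print(haptic_events_from_audio(audio, fs)[:3])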