Patents Assigned to Immersion
  • Publication number: 20140218536
    Abstract: A video/audio system includes an interface device that receives a plurality of audio and video signals from a plurality of sources. The interface device combines these signals into various combinations and transmits the combinations to a receiver. The receiver is configured to interface one of the combinations of signals with a user. In this regard, the receiver allows the user to select one of the combinations, and in response, the receiver separates the video signal(s) of the selected combination from the audio signal(s) of the selected combination. Then, the receiver renders the video signal(s) via a display device and produces a sound defined by the audio signal(s) via a speaker. Accordingly, the user is able to control which set of audio and video signals is interfaced with the user.
    Type: Application
    Filed: April 4, 2014
    Publication date: August 7, 2014
    Applicant: IMMERSION ENTERTAINMENT, LLC
    Inventors: Tazwell L. Anderson, JR., Mark A. Wood
  • Publication number: 20140218184
    Abstract: A system for managing a plurality of wearable devices on a user receives information to be conveyed using haptic effects and determines an intent of the information. The system then determines, for each of the plurality of wearable haptic devices, a location of the wearable haptic device on the user and a haptic capability. The system then maps the information as a haptic effect to one or more of the wearable haptic devices based at least on the determined locations on the user and the haptic capabilities.
    Type: Application
    Filed: February 4, 2013
    Publication date: August 7, 2014
    Applicant: IMMERSION CORPORATION
    Inventors: Danny GRANT, Juan Manuel CRUZ-HERNANDEZ
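The mapping step in publication 20140218184 lends itself to a small sketch: given a piece of information with an intent, pick the worn devices whose body location and haptic capability fit that intent. The `WearableDevice` type, the `INTENT_RULES` table, and the routing rule below are illustrative assumptions, not Immersion's algorithm.

```python
# Hypothetical sketch of the mapping described in publication 20140218184:
# route information, by intent, to wearable haptic devices chosen for their
# body location and haptic capability. Names and rules are illustrative.
from dataclasses import dataclass

@dataclass
class WearableDevice:
    name: str
    location: str          # e.g. "wrist", "ankle", "chest"
    capabilities: set      # e.g. {"vibration", "deformation"}

# Illustrative routing table: intent -> (preferred locations, required capability)
INTENT_RULES = {
    "navigation": (["wrist", "ankle"], "vibration"),
    "alert":      (["chest", "wrist"], "vibration"),
}

def map_information_to_devices(intent, devices):
    """Return the devices that should render a haptic effect for this intent."""
    preferred_locations, required_capability = INTENT_RULES.get(
        intent, ([], "vibration"))
    return [
        d for d in devices
        if d.location in preferred_locations and required_capability in d.capabilities
    ]

if __name__ == "__main__":
    worn = [
        WearableDevice("band-left", "wrist", {"vibration"}),
        WearableDevice("strap", "chest", {"vibration", "deformation"}),
    ]
    print([d.name for d in map_information_to_devices("navigation", worn)])
```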
  • Patent number: 8797352
    Abstract: The invention relates to a method and devices for enabling a user to visualize a virtual model in a real environment. According to the invention, a 2D representation of a 3D virtual object is inserted, in real time, into the video stream of a camera aimed at a real environment in order to form an enriched video stream. A plurality of cameras generating a plurality of video streams can be used simultaneously to visualize the virtual object in the real environment from different viewing angles. A particular video stream is used to dynamically generate the effects of the real environment on the virtual model. The virtual model can be, for example, a digital copy or virtual enrichments of a real copy. A virtual 2D object, for example the representation of a real person, can be inserted into the enriched video stream.
    Type: Grant
    Filed: August 9, 2006
    Date of Patent: August 5, 2014
    Assignee: Total Immersion
    Inventors: Valentin Lefevre, Jean-Marie Vaidie
  • Patent number: 8791799
    Abstract: A system that generates a haptic effect using an Eccentric Rotating Mass (“ERM”) actuator determines a back electromotive force (“EMF”) of the ERM actuator and receives a haptic effect signal comprising one or more parameters, where one of the parameters is a voltage amplitude level as a function of time. The system varies the voltage amplitude level based at least on the back EMF, and applies the varied haptic effect signal to the ERM actuator.
    Type: Grant
    Filed: January 31, 2013
    Date of Patent: July 29, 2014
    Assignee: Immersion Corporation
    Inventors: Robert A. Lacroix, Michael A. Greenish, Erin B. Ramsay
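Patent 8,791,799 describes varying a commanded voltage level using the ERM's back EMF (a proxy for rotor speed). A minimal sketch of that idea, assuming a simple proportional correction and made-up constants:

```python
# Hypothetical illustration of the idea in patent 8,791,799: adjust the drive
# voltage sent to an ERM actuator using an estimate of its back EMF (which is
# proportional to rotor speed). The constants and the simple proportional
# correction are assumptions, not the patented control law.
def corrected_drive_voltage(commanded_voltage, measured_back_emf,
                            target_back_emf, gain=0.5, v_max=5.0):
    """Nudge the commanded voltage so the rotor speed (via back EMF) tracks a target."""
    error = target_back_emf - measured_back_emf
    adjusted = commanded_voltage + gain * error
    # Clamp to the actuator's safe supply range.
    return max(-v_max, min(v_max, adjusted))

# Example: the effect asks for 3.0 V at this instant, but the rotor is spinning
# slower than intended (back EMF 0.8 V vs. the expected 1.2 V), so boost slightly.
print(corrected_drive_voltage(3.0, measured_back_emf=0.8, target_back_emf=1.2))
```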
  • Publication number: 20140205260
    Abstract: A system includes a video recorder configured to record video data, a sensor configured to sense movement of an object and output sensor data representative of the movement of the object, a transformer configured to transform the sensor data into a haptic output signal, a haptic output device configured to generate a haptic effect to a user based on the haptic output signal, a display configured to display a video, and a processor configured to synchronize the video data and the haptic output signal, and output the video data to the display and the haptic output signal to the haptic output device so that the haptic effect is synchronized with the video displayed on the display.
    Type: Application
    Filed: March 14, 2013
    Publication date: July 24, 2014
    Applicant: IMMERSION CORPORATION
    Inventors: Robert Lacroix, Juan Manuel Cruz-Hernandez, Jamal Saboune
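Publication 20140205260 centers on synchronizing recorded video with a sensor-derived haptic signal. One plausible, simplified way to do that is to pair each video frame with the nearest-in-time haptic sample; the function below is a sketch under that assumption, with invented data shapes.

```python
# Rough sketch (not the patented method) of the synchronization idea in
# publication 20140205260: pair each recorded video frame with the haptic
# sample whose timestamp is closest, so video and haptic playback stay aligned.
import bisect

def synchronize(video_frames, haptic_samples):
    """video_frames / haptic_samples: lists of (timestamp_seconds, payload), sorted by time."""
    haptic_times = [t for t, _ in haptic_samples]
    paired = []
    for t, frame in video_frames:
        i = bisect.bisect_left(haptic_times, t)
        # Choose the nearer of the two neighbouring haptic samples.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(haptic_samples)]
        if not candidates:
            paired.append((frame, None))   # No haptic data recorded at all.
            continue
        j = min(candidates, key=lambda k: abs(haptic_times[k] - t))
        paired.append((frame, haptic_samples[j][1]))
    return paired
```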
  • Publication number: 20140208204
    Abstract: A haptic device includes a display configured to display an image, a haptic output device configured to generate a haptic effect to a user when the user interacts with the display, and a processor configured to receive information related to the image displayed on the display. The processor is also configured to create a friction based haptic effect map associated with the image displayed on the display, and generate a signal to the haptic output device to output the haptic effect when the user interacts with the display when the image is displayed on the display, the haptic effect being configured to simulate a feel of the image in three dimensions.
    Type: Application
    Filed: March 11, 2013
    Publication date: July 24, 2014
    Applicant: IMMERSION CORPORATION
    Inventors: Robert Lacroix, Vincent Levesque
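A rough way to picture the “friction based haptic effect map” of publication 20140208204 is a per-pixel friction value derived from local image contrast, so edges and textures feel raised under the finger. The gradient rule and scaling below are assumptions for illustration only.

```python
# Illustrative sketch only: one plausible way to build a friction-based haptic
# effect map is to derive a per-pixel friction level from local image gradients,
# so edges and textures feel raised. This rule is an assumption, not Immersion's
# algorithm.
def friction_map(gray_image):
    """gray_image: 2-D list of brightness values in [0, 1]. Returns friction in [0, 1]."""
    h, w = len(gray_image), len(gray_image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx = gray_image[y][min(x + 1, w - 1)] - gray_image[y][x]
            dy = gray_image[min(y + 1, h - 1)][x] - gray_image[y][x]
            # Stronger local contrast -> higher simulated friction.
            out[y][x] = min(1.0, (dx * dx + dy * dy) ** 0.5 * 4.0)
    return out
```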
  • Publication number: 20140204079
    Abstract: A system (10) for displaying at least one virtual object includes a secondary screen (20) for displaying the virtual object, a primary screen (30), an optical element for overlaying images displayed on the secondary screen (20) with images displayed on the primary screen (30), and a pointing surface combined with the primary screen (30) for detecting the contact of one or more physical pointing elements. A device (90) for manipulating at least one virtual object includes calculation elements for generating images of the virtual object displayed on the system (10) from information output from the system (10) in accordance with the actions of the operator (100).
    Type: Application
    Filed: June 15, 2012
    Publication date: July 24, 2014
    Applicants: Immersion, Inria - Institut National de Recherche en Informatique et en Automatique
    Inventors: Jean-Baptiste De La Riviere, Christophe Chartier, Martin Hachet, Benoit Bossavit, Gery Casiez
  • Patent number: 8788253
    Abstract: Embodiments of the invention relate to methods and systems for providing haptic feedback to a user interacting with a simulated (or “virtual”) pet, so as to enhance the realism of the user's relationship with the virtual pet. In one embodiment, a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
    Type: Grant
    Filed: October 30, 2002
    Date of Patent: July 22, 2014
    Assignee: Immersion Corporation
    Inventor: Louis B. Rosenberg
  • Publication number: 20140198130
    Abstract: A device may be configured to provide feedback based on an augmented reality environment. The device may comprise, for example, a processor configured to receive a control signal from an augmented reality device and a feedback device configured to provide a feedback based on the received control signal. The augmented reality device may generate an augmented reality environment and may be remote from the device. The control signal received by the device may be representative of an event occurring in the augmented reality environment. The augmented reality environment may include a physical space in which at least one physical object exists and an augmented reality space in which one or more virtual objects that augment the physical object are displayed.
    Type: Application
    Filed: January 15, 2013
    Publication date: July 17, 2014
    Applicant: IMMERSION CORPORATION
    Inventor: Robert LACROIX
  • Publication number: 20140195906
    Abstract: Systems, methods, and associated software are described herein for enabling a regular user of an end user device, such as a cellular telephone, to customize parameters associated with haptic effects applied to the user by the end user device. In one implementation, among several, a method described herein includes enabling a user of an end user device to access software adapted to design or modify haptic effects of the end user device. The method further includes enabling the user to open a haptic track file and enter or modify parameters associated with the haptic effects of the opened haptic track file.
    Type: Application
    Filed: February 26, 2014
    Publication date: July 10, 2014
    Applicant: Immersion Corporation
    Inventors: Erin B. RAMSAY, Robert W. HEUBEL, Jason D. FLEMING, Stephen D. RANK
  • Patent number: 8773356
    Abstract: Systems and methods for providing tactile sensations are disclosed. For example, one disclosed method includes the steps of outputting a display signal configured to display a graphical object on a touch-sensitive input device; receiving a sensor signal from the touch-sensitive input device, the sensor signal indicating an object contacting the touch-sensitive input device; determining an interaction between the object contacting the touch-sensitive input device and the graphical object; and generating an actuator signal based at least in part on the interaction.
    Type: Grant
    Filed: January 31, 2012
    Date of Patent: July 8, 2014
    Assignee: Immersion Corporation
    Inventors: Kenneth M. Martin, Steven P. Vassallo, Alex S. Goldenberg, Alexander Jasso, Kollin M. Tierling
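Patent 8,773,356 ties a touch interaction with a displayed graphical object to an actuator signal. A simplified sketch of that flow, with a hypothetical on-screen button and a made-up pressure-to-magnitude rule:

```python
# Simplified, assumption-laden sketch of the flow in patent 8,773,356: read a
# touch position, test it against a displayed graphical object, and build an
# actuator signal whose strength depends on the interaction.
def actuator_signal(touch_x, touch_y, button_rect, pressure):
    """button_rect: (x, y, width, height); returns (magnitude, duration_ms) or None."""
    x, y, w, h = button_rect
    inside = x <= touch_x <= x + w and y <= touch_y <= y + h
    if not inside:
        return None
    # Firmer presses on the on-screen button map to a stronger pulse (made-up rule).
    return (min(1.0, 0.3 + 0.7 * pressure), 30)

print(actuator_signal(120, 60, (100, 40, 80, 40), pressure=0.5))
```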
  • Patent number: 8773247
    Abstract: Haptic output devices and related systems and methods are described in the present disclosure. In various implementations, a haptic output device includes a reservoir filled with a liquid. At least one side of the reservoir includes a flexible membrane. The haptic output device also includes a first actuator in physical contact with the reservoir and configured to impart pressure waves to the liquid. The pressure waves interact with the flexible membrane to supply a haptic effect to a user.
    Type: Grant
    Filed: December 15, 2009
    Date of Patent: July 8, 2014
    Assignee: Immersion Corporation
    Inventor: Christopher J. Ullrich
  • Publication number: 20140189506
    Abstract: Embodiments of systems and methods for interpreting physical interactions with a graphical user interface are disclosed. For example, one system for interpreting physical interactions with a graphical user interface is a device having a housing configured to be grasped by a user, a display disposed in the housing, the display configured to display a graphical user interface, and a sensor disposed in the housing, the sensor configured to detect a movement of the housing in a degree of freedom. The device also includes a processor disposed in the housing and in communication with the display and the sensor, the processor configured to receive a sensor signal from the sensor, the sensor signal comprising a parameter associated with the movement, to determine a command associated with the graphical user interface based on the parameter, to determine a function to be executed based on the command, and to execute the function.
    Type: Application
    Filed: March 6, 2014
    Publication date: July 3, 2014
    Applicant: Immersion Corporation
    Inventors: David M. Birnbaum, Christopher J. Ullrich, Peter Rubin, Phong David Ngo, Leo Kopelow
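Publication 20140189506 maps a sensed movement of the housing to a command and then to a function. The thresholds, command names, and functions below are invented placeholders that merely illustrate that chain:

```python
# Hypothetical mapping, in the spirit of publication 20140189506, from a sensed
# movement of the device housing to a GUI command and then to a function.
COMMANDS = {
    "tilt_left":  lambda: print("previous page"),
    "tilt_right": lambda: print("next page"),
    "shake":      lambda: print("undo"),
}

def interpret_motion(roll_deg, accel_peak_g):
    """Turn a movement parameter into a command, then execute its function."""
    if accel_peak_g > 2.5:
        command = "shake"
    elif roll_deg < -20:
        command = "tilt_left"
    elif roll_deg > 20:
        command = "tilt_right"
    else:
        return  # No recognized gesture; nothing to execute.
    COMMANDS[command]()   # Determine and execute the associated function.

interpret_motion(roll_deg=35, accel_peak_g=0.4)   # -> "next page"
```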
  • Publication number: 20140184497
    Abstract: A system produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal, such as from an accelerometer or gyroscope, or from a signal created by processing data such as still images, video, or sound. The haptic effect may optionally be modified dynamically by using the gesture signal, the real or virtual device sensor signal, and a physical model, or may optionally be applied concurrently to multiple devices which are connected via a communication link. The haptic effect may optionally be encoded into a data file on a first device. The data file is then communicated to a second device, and the haptic effect is read from the data file and applied to the second device.
    Type: Application
    Filed: February 10, 2014
    Publication date: July 3, 2014
    Applicant: IMMERSION CORPORATION
    Inventors: David Birnbaum, Chris Ullrich, Jason Short, Ryan Devenish
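Publication 20140184497 modulates a haptic effect by both a gesture signal and a device sensor signal. A minimal sketch, assuming swipe speed as the gesture signal and accelerometer magnitude as the sensor signal, with arbitrary reference constants:

```python
# Minimal sketch, under assumed names, of the modulation described in publication
# 20140184497: a base haptic effect is scaled dynamically by a gesture signal
# (here, swipe speed) and a device sensor signal (here, accelerometer magnitude).
def dynamic_magnitude(base_magnitude, swipe_speed_px_per_s, accel_g,
                      speed_ref=1000.0, accel_ref=2.0):
    """Return a haptic magnitude in [0, 1] modulated by gesture and motion."""
    gesture_factor = min(1.0, swipe_speed_px_per_s / speed_ref)
    motion_factor = min(1.0, abs(accel_g) / accel_ref)
    return min(1.0, base_magnitude
               * (0.5 + 0.5 * gesture_factor)
               * (0.5 + 0.5 * motion_factor))

# e.g. a fast swipe while the phone is being shaken produces a stronger effect.
print(dynamic_magnitude(0.8, swipe_speed_px_per_s=1500, accel_g=1.5))
```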
  • Patent number: 8761915
    Abstract: In an embodiment, a system and method automatically convert a plurality of events, in a plurality of channels of a structured representation sequence, into haptic events. The method comprises calculating an event score for each event of the sequence in one or more channels. The method also comprises calculating a cumulative score based on the event scores in the one or more channels. The method includes selectively designating haptic events to the events based on the event scores in one or more selected channels, wherein the haptic events are output by a haptic actuator. The system may do this by calculating properties of the sound, or by using already existing values associated with those properties, to efficiently produce the haptic events.
    Type: Grant
    Filed: April 29, 2011
    Date of Patent: June 24, 2014
    Assignee: Immersion Corporation
    Inventors: Christopher J. Ullrich, Stephen D. Rank, Munibe M. Bakircioglu
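Patent 8,761,915 scores events per channel, accumulates per-channel scores, and designates haptic events in selected channels. The sketch below uses an invented velocity-times-duration score and a top-N channel selection purely to illustrate that structure:

```python
# Sketch (with invented scoring weights) of the selection logic summarized in
# patent 8,761,915: score each note-like event in each channel, accumulate a
# per-channel score, and designate haptic events only in the strongest channels.
def select_haptic_events(channels, top_n=2):
    """channels: {name: [ {"velocity": 0-127, "duration": seconds}, ... ]}."""
    def event_score(e):
        return e["velocity"] / 127.0 * max(e["duration"], 0.05)

    cumulative = {name: sum(event_score(e) for e in events)
                  for name, events in channels.items()}
    chosen = sorted(cumulative, key=cumulative.get, reverse=True)[:top_n]
    # Every event in a chosen channel becomes a candidate haptic event.
    return {name: channels[name] for name in chosen}
```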
  • Publication number: 20140168091
    Abstract: A touchscreen generates two or more displays that are visible at different viewing angles, e.g., one is visible only from the driver's seat of a car and the other is visible only from the passenger seat. The displays occupy overlapping areas on the display surface, so input controls for the first display may overlap with input controls for the second display. If one of the users engages the display, the system identifies the user, determines which display that user is viewing and which input the user may be supplying, and may generate a haptic stimulus for that display.
    Type: Application
    Filed: December 13, 2012
    Publication date: June 19, 2014
    Applicant: IMMERSION CORPORATION
    Inventor: Trevor Jones
  • Publication number: 20140167941
    Abstract: A method of generating a haptic effect on a linear resonant actuator (“LRA”) having a resonant frequency includes receiving a haptic effect signal for the haptic effect, where the haptic effect comprises a desired frequency that is off-resonance for the LRA. The method further includes generating a first sine wave at the desired frequency and generating a second sine wave at or near the resonant frequency. The method further includes combining the first sine wave and the second sine wave to generate a drive signal.
    Type: Application
    Filed: December 13, 2013
    Publication date: June 19, 2014
    Applicant: Immersion Corporation
    Inventors: Stephen D. RANK, Erin B. RAMSAY, Henry DA COSTA, Arnab SEN, Elena Renee REDELSHEIMER
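Publication 20140167941 is concrete enough to sketch directly: mix a sine at the desired off-resonance frequency with a sine at (or near) the LRA's resonant frequency to form the drive signal. The amplitude weights and sample rate below are placeholder assumptions:

```python
# A simplified reading of publication 20140167941: to play an off-resonance
# frequency on an LRA, mix a sine at the desired frequency with a sine at (or
# near) the actuator's resonant frequency. Amplitudes here are arbitrary.
import math

def drive_signal(desired_hz, resonant_hz, duration_s, sample_rate=8000,
                 desired_amp=0.6, resonant_amp=0.4):
    n = int(duration_s * sample_rate)
    return [
        desired_amp * math.sin(2 * math.pi * desired_hz * t / sample_rate)
        + resonant_amp * math.sin(2 * math.pi * resonant_hz * t / sample_rate)
        for t in range(n)
    ]

# e.g. a 90 Hz effect on an LRA whose resonance is 175 Hz:
samples = drive_signal(90, 175, duration_s=0.05)
```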
  • Patent number: 8754757
    Abstract: A system is provided that automatically generates one or more haptic effects from source data, such as audio source data. The system fits the one or more haptic effects to the source data by analyzing the source data and identifying one or more haptic effects that are the most similar to the source data. The system matches the identified one or more haptic effects with the source data. The system subsequently outputs the identified one or more haptic effects.
    Type: Grant
    Filed: March 5, 2013
    Date of Patent: June 17, 2014
    Assignee: Immersion Corporation
    Inventors: Christopher J Ullrich, Danny Grant, Victor Aaron Viegas, Juan Manuel Cruz-Hernandez
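Patents 8,754,757 and 8,754,758 (which share an abstract) match audio source data against a library of haptic effects by similarity. The feature set (energy and zero-crossing rate) and the nearest-match rule below are invented stand-ins for whatever the patented analysis actually uses:

```python
# Purely illustrative take on the audio-to-haptic matching idea: summarize an
# audio segment with a couple of coarse features and pick the haptic effect
# from a small library whose stored features are closest.
def pick_haptic_effect(audio_samples, sample_rate, library):
    """library: {effect_name: (energy, zero_cross_rate)}; returns the best-matching name."""
    n = len(audio_samples)
    energy = sum(s * s for s in audio_samples) / n
    zero_cross_rate = sum(
        1 for a, b in zip(audio_samples, audio_samples[1:]) if (a < 0) != (b < 0)
    ) / n
    def distance(feat):
        return (feat[0] - energy) ** 2 + (feat[1] - zero_cross_rate) ** 2
    return min(library, key=lambda name: distance(library[name]))
```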
  • Patent number: 8754758
    Abstract: A system is provided that automatically generates one or more haptic effects from source data, such as audio source data. The system fits the one or more haptic effects to the source data by analyzing the source data and identifying one or more haptic effects that are the most similar to the source data. The system matches the identified one or more haptic effects with the source data. The system subsequently outputs the identified one or more haptic effects.
    Type: Grant
    Filed: March 7, 2013
    Date of Patent: June 17, 2014
    Assignee: Immersion Corporation
    Inventors: Christopher J Ullrich, Danny Grant, Victor Aaron Viegas, Juan Manuel Cruz-Hernandez
  • Publication number: 20140160034
    Abstract: A system is provided that generates a dynamic haptic effect that includes one or more key frames, where each key frame includes a first interpolant value and a first haptic effect. The system further receives an interpolant value, where the interpolant value is between at least two interpolant values of at least two key frames. The system further determines the dynamic haptic effect from the interpolant value. The system further distributes the dynamic haptic effect among a plurality of actuators.
    Type: Application
    Filed: December 10, 2012
    Publication date: June 12, 2014
    Applicant: IMMERSION CORPORATION
    Inventors: Henry DA COSTA, Eric GERVAIS, Satvir Singh BHATIA
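Publication 20140160034 pairs key frames (an interpolant value plus a haptic effect) with a runtime interpolant and distributes the result over several actuators. The sketch below assumes linear interpolation between the two bracketing key frames and a fixed per-actuator weighting, neither of which is specified here:

```python
# Hedged sketch of the key-frame idea in publication 20140160034: each key frame
# pairs an interpolant value with a haptic magnitude; a runtime interpolant picks
# the two bracketing key frames, the effect is interpolated between them, and the
# result is spread across several actuators. Linear interpolation and the fixed
# weights are assumptions for illustration.
def dynamic_effect(key_frames, interpolant, actuator_weights):
    """key_frames: sorted list of (interpolant_value, magnitude); returns per-actuator magnitudes."""
    (x0, m0), (x1, m1) = key_frames[0], key_frames[-1]
    for (a, ma), (b, mb) in zip(key_frames, key_frames[1:]):
        if a <= interpolant <= b:
            (x0, m0), (x1, m1) = (a, ma), (b, mb)
            break
    t = 0.0 if x1 == x0 else (interpolant - x0) / (x1 - x0)
    magnitude = m0 + t * (m1 - m0)
    return [magnitude * w for w in actuator_weights]

# e.g. interpolant 0.25 between key frames at 0.0 (weak) and 1.0 (strong),
# distributed unevenly over three actuators:
print(dynamic_effect([(0.0, 0.2), (1.0, 1.0)], 0.25, [0.5, 0.3, 0.2]))
```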