Patents by Inventor Liwen Wu

Liwen Wu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10175760
    Abstract: A haptically enabled system receives a haptic effect primitive comprising a plurality of input parameters and receives an input from a sensor. The system generates a haptic effect signal from the haptic effect primitive, the haptic effect signal comprising a plurality of output parameters where at least one of the output parameters is varied based on the sensor input. The system then applies the haptic effect signal to an actuator.
    Type: Grant
    Filed: October 28, 2016
    Date of Patent: January 8, 2019
    Assignee: Immersion Corporation
    Inventors: Juan Manuel Cruz-Hernandez, Liwen Wu, Christopher J. Ullrich
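    The system above pairs a parameterized effect primitive with live sensor input that modulates the rendered output. A minimal Python sketch of that idea follows; the primitive fields, the clamping/scaling rule, and the `Actuator` stand-in are assumptions for illustration, not the patented implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class HapticPrimitive:
        # Input parameters of the effect primitive (illustrative names).
        base_magnitude: float      # 0.0 .. 1.0
        duration_ms: int
        frequency_hz: float

    def generate_signal(primitive: HapticPrimitive, sensor_value: float) -> dict:
        """Derive output parameters, varying magnitude with the sensor reading."""
        sensor_value = max(0.0, min(1.0, sensor_value))            # clamp the sensor input
        return {
            "magnitude": primitive.base_magnitude * sensor_value,  # sensor-modulated output
            "duration_ms": primitive.duration_ms,
            "frequency_hz": primitive.frequency_hz,
        }

    class Actuator:
        def play(self, signal: dict) -> None:
            print(f"actuator <- {signal}")                         # stand-in for the hardware drive

    # Example: a sensor reading of 0.6 scales the effect's magnitude from 0.8 to 0.48.
    Actuator().play(generate_signal(HapticPrimitive(0.8, 120, 170.0), sensor_value=0.6))
    ```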
  • Patent number: 10176680
    Abstract: A method of generating event identifiers includes receiving sensor information from tracked entities. Based on the sensor information for tracked entities, an event can be determined. An event ID can be assigned to the event based on the type of event that was determined. The event ID can be sent to a haptically enabled device, the device outputting a haptic effect determined from the event ID.
    Type: Grant
    Filed: June 19, 2017
    Date of Patent: January 8, 2019
    Assignee: IMMERSION CORPORATION
    Inventors: Jamal Saboune, Juan Manuel Cruz-Hernandez, Liwen Wu, Abdelwahab Hamam
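    The entry above describes deriving an event ID from tracked-entity sensor data and letting the receiving device resolve that ID to a haptic effect. The sketch below assumes a toy classifier and lookup tables (`EVENT_IDS`, `DEVICE_EFFECTS`) that are not part of the patent.

    ```python
    # Hypothetical event types and the effects a device might map their IDs to.
    EVENT_IDS = {"collision": 1, "goal_scored": 2, "whistle": 3}
    DEVICE_EFFECTS = {1: "strong_pulse", 2: "double_click", 3: "short_buzz"}

    def classify_event(sensor_info: dict) -> str | None:
        """Toy classifier: derive an event type from tracked-entity sensor data."""
        if sensor_info.get("relative_speed", 0.0) > 5.0:
            return "collision"
        return None

    def send_event_id(event_type: str) -> None:
        event_id = EVENT_IDS[event_type]
        # The receiving device resolves the ID to a locally stored haptic effect.
        print(f"device plays: {DEVICE_EFFECTS[event_id]}")

    event = classify_event({"relative_speed": 7.2})
    if event:
        send_event_id(event)
    ```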
  • Publication number: 20180321752
    Abstract: A system includes an electronic device that includes a display screen, a cover configured to cover the display screen, a sensor configured to sense an input gesture comprising deformation and/or movement of the cover relative to the electronic device, and a processor configured to determine an action for the electronic device to perform based on the input gesture, to determine a haptic effect to generate based on the input gesture and/or the action for the electronic device to perform, and to initiate the action. The system also includes a haptic output device configured to generate the haptic effect.
    Type: Application
    Filed: June 29, 2018
    Publication date: November 8, 2018
    Inventors: Vincent LEVESQUE, Jamal SABOUNE, Juan Manuel CRUZ HERNANDEZ, Abdelwahab HAMAM, Vahid KHOSHKAVA, Liwen WU
  • Publication number: 20180314333
    Abstract: Systems and methods for force-based object manipulation and haptic sensations are disclosed. One disclosed method includes the steps of receiving a first signal indicating a location of a user interaction and receiving a second signal indicating a first force. The method also includes, if the location of the user interaction corresponds to an object displayed on a display screen: outputting a first haptic signal to a haptic output device to cause a first haptic effect; and outputting a second haptic signal to the haptic output device to cause a second haptic effect if the first force meets or exceeds a first force threshold.
    Type: Application
    Filed: June 21, 2018
    Publication date: November 1, 2018
    Applicant: Immersion Corporation
    Inventors: Vincent Levesque, Juan Manuel Cruz-Hernandez, Danny Grant, Jamal Saboune, Liwen Wu, Kurt Eerik Stahlberg, Abdelwahab Hamam
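    The two-stage behaviour described above (one effect on contact with the displayed object, a second once a force threshold is met) can be sketched as follows; the threshold value, bounds check, and effect names are illustrative assumptions.

    ```python
    FORCE_THRESHOLD = 2.0  # newtons; an assumed value for illustration

    def on_touch(location, force, object_bounds, haptic_out):
        """Emit a first effect on contact with the object, and a second one past the threshold."""
        x, y = location
        x0, y0, x1, y1 = object_bounds
        if x0 <= x <= x1 and y0 <= y <= y1:        # interaction lies on the displayed object
            haptic_out("edge_tick")                # first haptic effect
            if force >= FORCE_THRESHOLD:
                haptic_out("deep_click")           # second, force-triggered effect

    on_touch((120, 80), force=2.5, object_bounds=(100, 60, 200, 160), haptic_out=print)
    ```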
  • Patent number: 10101813
    Abstract: Examples of devices, systems, and methods to automatically generate haptics based on visual color features and motion analysis are disclosed. In one example, a video having a plurality of frames is received and masked frames for the video are generated by applying a color mask to the plurality of frames. An event between two of the masked frames is detected and an optical flow estimate is generated for these masked frames. At least one haptic effect corresponding to the event is generated based on the optical flow. The generated haptic effect(s) may be output to a haptic file or a haptic output device, or both.
    Type: Grant
    Filed: December 14, 2016
    Date of Patent: October 16, 2018
    Assignee: IMMERSION CORPORATION
    Inventors: Liwen Wu, Jamal Saboune, Paige Raynes
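    A rough sketch of the color-mask-plus-optical-flow pipeline described above is given below, using OpenCV's Farneback flow as a stand-in estimator; the HSV range, the normalization constant, and the mapping from mean flow speed to haptic magnitude are assumptions, not the patented method.

    ```python
    import cv2
    import numpy as np

    def masked_gray(frame_bgr, lower_hsv, upper_hsv):
        """Keep only pixels inside the HSV color range, returned as a grayscale frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        return cv2.bitwise_and(gray, gray, mask=mask)

    def haptic_magnitude(prev_bgr, next_bgr, lower_hsv, upper_hsv):
        """Estimate dense optical flow between two masked frames and map mean speed to [0, 1]."""
        a = masked_gray(prev_bgr, lower_hsv, upper_hsv)
        b = masked_gray(next_bgr, lower_hsv, upper_hsv)
        flow = cv2.calcOpticalFlowFarneback(a, b, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mean_speed = np.linalg.norm(flow, axis=2).mean()
        return min(1.0, mean_speed / 10.0)          # assumed normalization constant

    # Example with synthetic frames: a red patch shifts right between two frames.
    f1 = np.zeros((120, 160, 3), np.uint8); f1[40:80, 40:80] = (0, 0, 255)
    f2 = np.zeros((120, 160, 3), np.uint8); f2[40:80, 55:95] = (0, 0, 255)
    print(haptic_magnitude(f1, f2, np.array([0, 120, 120]), np.array([10, 255, 255])))
    ```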
  • Patent number: 10102723
    Abstract: A method or system receives input media that includes at least video data, and a video event within the video data is detected. Related data associated with the detected video event is collected, and one or more feature parameters are configured based on the collected related data. The type of video event is determined, and a set of feature parameters is selected based on the type of video event. A haptic effect is then automatically generated based on the selected set of feature parameters.
    Type: Grant
    Filed: October 31, 2016
    Date of Patent: October 16, 2018
    Assignee: IMMERSION CORPORATION
    Inventor: Liwen Wu
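    The method above selects a feature-parameter set by event type and tunes it with related data before generating the effect. A minimal sketch under assumed parameter names and event types follows.

    ```python
    # Assumed feature-parameter sets keyed by event type; names and values are illustrative.
    PARAMETER_SETS = {
        "explosion": {"magnitude": 1.0, "duration_ms": 400, "attack_ms": 5},
        "engine":    {"magnitude": 0.4, "duration_ms": 2000, "frequency_hz": 60},
    }

    def generate_haptic_effect(event_type: str, related_data: dict) -> dict:
        """Pick the parameter set for the detected event type, then tune it with related data."""
        params = dict(PARAMETER_SETS[event_type])
        # e.g. scale magnitude by how close the event is to the viewer (assumed field).
        params["magnitude"] *= related_data.get("proximity", 1.0)
        return params

    print(generate_haptic_effect("explosion", {"proximity": 0.7}))
    ```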
  • Patent number: 10082874
    Abstract: A system is provided that converts an input, such as audio data, into one or more haptic effects. The system applies a granular synthesis algorithm to the input in order to generate a haptic signal. The system subsequently outputs the one or more haptic effects based on the generated haptic signal. The system can also shift a frequency of the input, and also filter the input, before the system applies the granular synthesis algorithm to the input.
    Type: Grant
    Filed: June 13, 2017
    Date of Patent: September 25, 2018
    Assignee: IMMERSION CORPORATION
    Inventors: Juan Manuel Cruz-Hernandez, Ali Modarres, Liwen Wu, David Birnbaum
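    Granular synthesis, as referenced above, builds an output by overlap-adding short windowed grains taken from the input. The sketch below shows only that core step; the grain length, grain rate, and normalization are assumed values, and the optional frequency shifting and filtering mentioned in the abstract are omitted.

    ```python
    import numpy as np

    def granular_haptic(audio: np.ndarray, sr: int, grain_ms: float = 20.0,
                        density_hz: float = 40.0) -> np.ndarray:
        """Small granular-synthesis sketch: extract windowed grains from the input and
        overlap-add them at a fixed grain rate to form a haptic drive signal."""
        grain_len = int(sr * grain_ms / 1000)
        hop = int(sr / density_hz)                       # spacing between grain onsets
        window = np.hanning(grain_len)
        out = np.zeros(len(audio) + grain_len)
        for start in range(0, len(audio) - grain_len, hop):
            grain = audio[start:start + grain_len] * window
            out[start:start + grain_len] += grain        # overlap-add
        peak = np.max(np.abs(out)) or 1.0
        return out[:len(audio)] / peak                   # normalize for the actuator

    sr = 8000
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 220 * t)                   # one second of a 220 Hz tone
    print(granular_haptic(tone, sr).shape)
    ```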
  • Publication number: 20180232051
    Abstract: A method and system for dynamically generating localized haptic effects includes receiving video data and detecting a video event within that video data. Information is collected that includes at least a position and type of the detected video event, as well as a position and orientation of a user's avatar in the video data. The locations of a first and a second haptic output device are determined. Haptic effects are dynamically generated for the first and second haptic output devices, wherein the dynamic generation of the haptic effects is based on the locations of the first and second haptic output devices and the position and orientation of the user's avatar relative to the position and type of the video event.
    Type: Application
    Filed: February 16, 2017
    Publication date: August 16, 2018
    Inventors: Liwen WU, Jamal SABOUNE
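    One plausible reading of the localization step above is per-device scaling by distance and by the event's bearing relative to the avatar's heading. The sketch below assumes two devices (left/right) and an illustrative magnitude table; none of these details come from the publication.

    ```python
    import math

    def localized_effects(event_pos, event_type, avatar_pos, avatar_yaw, device_sides):
        """Scale a base magnitude per device by distance to the event and by whether the
        device (e.g. left/right controller) faces the event in the avatar's frame."""
        base = {"explosion": 1.0, "footstep": 0.3}.get(event_type, 0.5)   # assumed table
        dx, dy = event_pos[0] - avatar_pos[0], event_pos[1] - avatar_pos[1]
        distance = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx) - avatar_yaw        # event direction in avatar frame
        effects = {}
        for name, side in device_sides.items():          # side: -1 for left, +1 for right
            pan = 0.5 + 0.5 * side * math.sin(bearing)   # simple left/right panning
            effects[name] = base * pan / (1.0 + distance)
        return effects

    print(localized_effects((3, 4), "explosion", (0, 0), 0.0, {"left": -1, "right": +1}))
    ```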
  • Patent number: 10031583
    Abstract: Systems and methods for force-based object manipulation and haptic sensations are disclosed. One disclosed method includes the steps of receiving a first signal indicating a location of a user interaction and receiving a second signal indicating a first force. The method also includes, if the location of the user interaction corresponds to an object displayed on a display screen: outputting a first haptic signal to a haptic output device to cause a first haptic effect; and outputting a second haptic signal to the haptic output device to cause a second haptic effect if the first force meets or exceeds a first force threshold.
    Type: Grant
    Filed: March 19, 2015
    Date of Patent: July 24, 2018
    Assignee: Immersion Corporation
    Inventors: Vincent Levesque, Juan Manuel Cruz-Hernandez, Danny Grant, Jamal Saboune, Liwen Wu, Kurt Eerik Stahlberg, Abdelwahab Hamam
  • Patent number: 10013060
    Abstract: A system includes an electronic device that includes a display screen, a cover configured to cover the display screen, a sensor configured to sense an input gesture comprising deformation and/or movement of the cover relative to the electronic device, and a processor configured to determine an action for the electronic device to perform based on the input gesture, to determine a haptic effect to generate based on the input gesture and/or the action for the electronic device to perform, and to initiate the action. The system also includes a haptic output device configured to generate the haptic effect.
    Type: Grant
    Filed: September 18, 2015
    Date of Patent: July 3, 2018
    Assignee: IMMERSION CORPORATION
    Inventors: Vincent Levesque, Jamal Saboune, Juan Manuel Cruz-Hernandez, Abdelwahab Hamam, Vahid Khoshkava, Liwen Wu
  • Publication number: 20180164887
    Abstract: Examples of devices, systems, and methods to automatically generate haptics based on visual color features and motion analysis are disclosed. In one example, a video having a plurality of frames is received and masked frames for the video are generated by applying a color mask to the plurality of frames. An event between two of the masked frames is detected and an optical flow estimate is generated for these masked frames. At least one haptic effect corresponding to the event is generated based on the optical flow. The generated haptic effect(s) may be output to a haptic file or a haptic output device, or both.
    Type: Application
    Filed: December 14, 2016
    Publication date: June 14, 2018
    Inventors: Liwen Wu, Jamal Saboune, Paige Raynes
  • Publication number: 20180165926
    Abstract: Examples of devices, systems, and methods to automatically generate haptics based on visual odometry are disclosed. In one example, a video having a plurality of frames is received and an optical flow estimate between a first frame from the plurality of frames and a second frame from the plurality of frames is created. In this example, the second frame is subsequent to the first frame. An apparent movement of a stationary object between the first frame and the second frame is detected based at least in part on the optical flow estimate in this example and at least one haptic effect corresponding to the apparent movement of the stationary object is generated based at least in part on the optical flow estimate. The generated haptic effect(s) may be output to a haptic file or a haptic output device, or both.
    Type: Application
    Filed: December 14, 2016
    Publication date: June 14, 2018
    Inventors: Liwen Wu, Jamal Saboune, Paige Raynes
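    Apparent movement of stationary content is essentially camera egomotion, which a dense flow field exposes as a consistent global displacement. The sketch below takes the median flow vector as that estimate; using the median, and mapping its length directly to an effect intensity, are assumptions for illustration.

    ```python
    import numpy as np

    def camera_motion_magnitude(flow: np.ndarray) -> float:
        """Treat the median flow vector over the whole frame as the apparent motion of the
        stationary background (i.e. camera egomotion) and return its length."""
        median_vec = np.median(flow.reshape(-1, 2), axis=0)
        return float(np.linalg.norm(median_vec))

    # 'flow' would come from a dense optical-flow estimate between consecutive frames,
    # e.g. the Farneback sketch shown earlier in this listing.
    fake_flow = np.full((120, 160, 2), (1.5, -0.5), dtype=np.float32)
    print(camera_motion_magnitude(fake_flow))   # ~1.58, which could drive a rumble intensity
    ```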
  • Publication number: 20180167272
    Abstract: Example methods are provided for a first routing component to handle failure at a logical router in a first network. One method may comprise learning first path information associated with a first path provided by an active second routing component, and second path information associated with a second path provided by a standby second routing component. The method may also comprise in response to detecting a first egress packet destined for a second network, sending the first egress packet to the active second routing component based on the first path information. The method may further comprise in response to detecting a failure at the active second routing component and detecting a second egress packet destined for the second network, sending the second egress packet to a new active second routing component based on the second path information.
    Type: Application
    Filed: December 11, 2016
    Publication date: June 14, 2018
    Applicant: NICIRA, INC.
    Inventors: Liwen WU, Jia YU, Xinhua HONG, Ronghua ZHANG, David LEROY
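    The failover behaviour described above amounts to keeping both learned next hops and switching from the active one to the pre-learned standby one when a failure is detected. A small sketch under assumed, hypothetical link-local addresses:

    ```python
    class EdgeFailover:
        """Sketch of the described first-routing-component behaviour: retain both the
        active and the standby next hop, and fail over when the active stops responding."""

        def __init__(self, active_next_hop: str, standby_next_hop: str):
            self.paths = {"active": active_next_hop, "standby": standby_next_hop}
            self.active_alive = True

        def on_health_check(self, alive: bool) -> None:
            self.active_alive = alive                  # e.g. driven by periodic probes

        def next_hop_for_external(self) -> str:
            # Egress traffic for the external network follows the learned active path,
            # or the pre-learned standby path once a failure is detected.
            return self.paths["active"] if self.active_alive else self.paths["standby"]

    fwd = EdgeFailover("169.254.0.2", "169.254.0.3")   # hypothetical next-hop addresses
    fwd.on_health_check(False)
    print(fwd.next_hop_for_external())                 # -> 169.254.0.3
    ```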
  • Publication number: 20180164885
    Abstract: Systems and methods for compliance illusions with haptics are disclosed. One illustrative system described herein includes a user interface device including: a sensor configured to detect a gesture; a haptic output device configured to output haptic effects; and a processor coupled to the sensor and the haptic output device, the processor configured to: receive a sensor signal from the sensor; determine a user interaction in mixed reality; determine a haptic effect based in part on the sensor signal and the user interaction; and transmit a haptic signal associated with the haptic effect to the haptic output device.
    Type: Application
    Filed: December 9, 2016
    Publication date: June 14, 2018
    Inventors: Juan Manuel Cruz-Hernandez, Liwen Wu, Neil T. Olien, Danny A Grant, Jamal Saboune
  • Publication number: 20180138831
    Abstract: Examples of devices, systems, and methods of controlling one or more contact conditions of an insulated static electrostatic force electrode are disclosed. One example device has an insulated static electrostatic force electrode and a flexible suspension attached to the insulated static electrostatic force electrode. In examples, the flexible suspension controls a contact condition of the static electrostatic force electrode to alter the static electrostatic force feedback provided by the insulated static electrostatic force electrode.
    Type: Application
    Filed: November 17, 2016
    Publication date: May 17, 2018
    Inventors: Vincent Levesque, Mansoor Alghooneh, Jamal Saboune, Vahid Khoshkava, Mohammadreza Motamedi, Danny A. Grant, Juan Manuel Cruz-Hernandez, Liwen Wu
  • Publication number: 20180130320
    Abstract: One illustrative system disclosed herein includes an enclosure configured to define a boundary of a chamber, the chamber including a material, and a flexible layer coupled overtop of the chamber and configured to enclose the chamber. The illustrative system also includes a first actuation device configured to receive a first haptic signal and responsively output a first haptic effect by changing a characteristic of the material to deform the flexible layer. The illustrative system also includes a second actuation device configured to receive a second haptic signal and responsively output a second haptic effect by applying an electrical signal to the flexible layer. The illustrative system further includes a processor in communication with the first actuation device and the second actuation device. The processor is configured to transmit the first haptic signal to the first actuation device and the second haptic signal to the second actuation device.
    Type: Application
    Filed: January 9, 2018
    Publication date: May 10, 2018
    Applicant: Immersion Corporation
    Inventors: Vahid Khoshkava, Vincent Levesque, Jamal Saboune, Abdelwahab Hamam, Juan Manuel Cruz-Hernandez, Liwen Wu
  • Publication number: 20180122197
    Abstract: A method or system receives input media that includes at least video data, and a video event within the video data is detected. Related data associated with the detected video event is collected, and one or more feature parameters are configured based on the collected related data. The type of video event is determined, and a set of feature parameters is selected based on the type of video event. A haptic effect is then automatically generated based on the selected set of feature parameters.
    Type: Application
    Filed: October 31, 2016
    Publication date: May 3, 2018
    Inventor: Liwen WU
  • Publication number: 20180097734
    Abstract: Some embodiments provide a method for managing traffic in a virtualized environment. The method, in some embodiments, configures multiple edge service gateways (ESGs) executing on multiple host machines (e.g., on a hypervisor) to use a same anycast inner internet protocol (IP) address and a same anycast inner media access control (MAC) address. In some embodiments, ESGs of a logical network facilitate communication between machines connected to the logical network and machines on external networks. In some embodiments, the method configures a set of virtual extensible local area network tunnel endpoints (VTEPs) connected to an ESG to use a same anycast VTEP IP address. The method, in some embodiments, configures a distributed logical router (DLR or DR) to send data packets with destinations outside the logical network from sources belonging to the logical network to the anycast VTEP IP address.
    Type: Application
    Filed: February 28, 2017
    Publication date: April 5, 2018
    Inventors: Sami Boutros, Anirban Sengupta, Sreeram Ravinoothala, Liwen Wu
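    The addressing scheme above lets a distributed router treat many ESG instances as a single next hop. The sketch below is a hypothetical rendering of that plan as plain data plus a toy forwarding decision; the addresses, prefix check, and field names are invented for illustration and are not actual product configuration.

    ```python
    # Hypothetical addressing plan matching the abstract: every ESG instance shares one
    # anycast inner IP/MAC, and the VTEPs in front of them share one anycast VTEP IP.
    ANYCAST_ESG = {"inner_ip": "10.0.255.1", "inner_mac": "02:00:00:00:00:01"}
    ANYCAST_VTEP_IP = "192.168.100.10"

    def northbound_next_hop(dst_ip: str, logical_prefix: str = "10.0.") -> dict:
        """DR forwarding sketch: anything outside the logical network goes to the anycast ESG."""
        if dst_ip.startswith(logical_prefix):          # crude east-west check, for illustration
            return {"forward": "local overlay", "dst": dst_ip}
        return {"forward": "encapsulate", "outer_dst": ANYCAST_VTEP_IP,
                "inner_dst_mac": ANYCAST_ESG["inner_mac"]}

    print(northbound_next_hop("8.8.8.8"))
    ```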
  • Patent number: 9898903
    Abstract: One illustrative system disclosed herein includes an enclosure configured to define a boundary of a chamber, the chamber including a material, and a flexible layer coupled overtop of the chamber and configured to enclose the chamber. The illustrative system also includes a first actuation device configured to receive a first haptic signal and responsively output a first haptic effect by changing a characteristic of the material to deform the flexible layer. The illustrative system also includes a second actuation device configured to receive a second haptic signal and responsively output a second haptic effect by applying an electrical signal to the flexible layer. The illustrative system further includes a processor in communication with the first actuation device and the second actuation device. The processor is configured to transmit the first haptic signal to the first actuation device and the second haptic signal to the second actuation device.
    Type: Grant
    Filed: March 7, 2016
    Date of Patent: February 20, 2018
    Assignee: Immersion Corporation
    Inventors: Vahid Khoshkava, Vincent Levesque, Jamal Saboune, Abdelwahab Hamam, Juan Manuel Cruz-Hernandez, Liwen Wu
  • Publication number: 20180011538
    Abstract: Embodiments generate haptic effects in response to a user input (e.g., a pressure-based or other gesture). Embodiments receive a first input range corresponding to user input and receive a haptic profile corresponding to the first input range. During a first dynamic portion of the haptic profile, embodiments generate a dynamic haptic effect that varies based on values of the first input range during the first dynamic portion. Further, at a first trigger position of the haptic profile, embodiments generate a triggered haptic effect.
    Type: Application
    Filed: July 7, 2017
    Publication date: January 11, 2018
    Inventors: WILLIAM S. RIHN, SANYA ATTARI, LIWEN WU, MIN LEE, DAVID BIRNBAUM
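    The profile described above combines a continuously varying (dynamic) effect over part of the input range with a discrete effect fired at a trigger position. A sketch with an assumed, normalized trigger position of 0.8:

    ```python
    TRIGGER_POSITION = 0.8   # assumed normalized input value at which the trigger fires

    def haptic_for_input(value: float, previous: float) -> list[str]:
        """Sketch of the described profile: a dynamic effect that tracks the input over its
        dynamic portion, plus a discrete triggered effect when the trigger position is crossed."""
        effects = []
        if value < TRIGGER_POSITION:                      # dynamic portion of the profile
            effects.append(f"dynamic effect, magnitude {value:.2f}")
        if previous < TRIGGER_POSITION <= value:          # crossing the trigger position
            effects.append("triggered effect: click")
        return effects

    # Example: a press sweeping from light to firm.
    prev = 0.0
    for v in (0.2, 0.5, 0.85):
        print(haptic_for_input(v, prev)); prev = v
    ```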