Patents by Inventor ABILASH RAJARETHINAM

ABILASH RAJARETHINAM has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11776687
    Abstract: An electronic apparatus and method for medical examination of a human body using haptics are provided. The electronic apparatus controls a first head-mounted display to render a 3D model of an anatomical portion of the body of a human subject. The rendered 3D model includes a region corresponding to a defect portion in the anatomical portion. The electronic apparatus transmits a touch input to a wearable sensor in contact with the anatomical portion. Such an input corresponds to a human touch on the region of the rendered 3D model. The electronic apparatus receives, based on the touch input, a set of bio-signals associated with the defect portion via the wearable sensor. The bio-signals include physiological signals and somatic sensation information associated with the defect portion. In response to the human touch, the electronic apparatus controls a wearable haptic device to generate haptic feedback based on the received set of bio-signals. (An illustrative sketch of this examination loop appears after the listing.)
    Type: Grant
    Filed: November 10, 2020
    Date of Patent: October 3, 2023
    Assignee: SONY GROUP CORPORATION
    Inventors: Ramu Ramachandran, Sankar Shanmugam, Abilash Rajarethinam
  • Patent number: 11575863
    Abstract: A projection system and method for depth-based projection of image-based content are provided. The projection system receives a sequence of image frames to be projected on a physical surface of a three-dimensional (3D) structure. The projection system controls a depth sensor to acquire depth data associated with the physical surface and feeds a first input, including the acquired depth data and a first image frame of the received sequence of image frames, to a neural network-based model. The projection system further receives a set of geometric correction values as a first output of the neural network-based model for the fed first input and modifies the first image frame based on the received set of geometric correction values. The projection system further controls illumination circuitry to project the modified first image frame onto the physical surface. (An illustrative sketch of this projection pipeline appears after the listing.)
    Type: Grant
    Filed: April 8, 2021
    Date of Patent: February 7, 2023
    Assignee: SONY GROUP CORPORATION
    Inventors: Sankar Shanmugam, Abilash Rajarethinam
  • Publication number: 20220329765
    Abstract: A projection system and method for depth-based projection of image-based content are provided. The projection system receives a sequence of image frames to be projected on a physical surface of a three-dimensional (3D) structure. The projection system controls a depth sensor to acquire depth data associated with the physical surface and feeds a first input, including the acquired depth data and a first image frame of the received sequence of image frames, to a neural network-based model. The projection system further receives a set of geometric correction values as a first output of the neural network-based model for the fed first input and modifies the first image frame based on the received set of geometric correction values. The projection system further controls illumination circuitry to project the modified first image frame onto the physical surface.
    Type: Application
    Filed: April 8, 2021
    Publication date: October 13, 2022
    Inventors: SANKAR SHANMUGAM, ABILASH RAJARETHINAM
  • Publication number: 20220148723
    Abstract: An electronic apparatus and method for medical examination of a human body using haptics are provided. The electronic apparatus controls a first head-mounted display to render a 3D model of an anatomical portion of the body of a human subject. The rendered 3D model includes a region corresponding to a defect portion in the anatomical portion. The electronic apparatus transmits a touch input to a wearable sensor in contact with the anatomical portion. Such an input corresponds to a human touch on the region of the rendered 3D model. The electronic apparatus receives, based on the touch input, a set of bio-signals associated with the defect portion via the wearable sensor. The bio-signals include physiological signals and somatic sensation information associated with the defect portion. In response to the human touch, the electronic apparatus controls a wearable haptic device to generate haptic feedback based on the received set of bio-signals.
    Type: Application
    Filed: November 10, 2020
    Publication date: May 12, 2022
    Inventors: RAMU RAMACHANDRAN, SANKAR SHANMUGAM, ABILASH RAJARETHINAM
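
The haptic-examination entries above (patent 11776687 and publication 20220148723) describe a control loop: render an anatomical 3D model on a head-mounted display, forward a touch on that model to a wearable sensor on the patient, read back bio-signals, and drive a wearable haptic device in response. The Python sketch below is a minimal illustration of that loop under assumptions of mine; the class names (HeadMountedDisplay, WearableSensor, WearableHapticDevice, BioSignals) and the example signal values are hypothetical placeholders, not interfaces from the patent.

    # Hypothetical sketch of the haptic examination loop described in the abstract;
    # all device classes and bio-signal fields are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class BioSignals:
        physiological: dict      # e.g. pulse or temperature near the defect portion
        somatic_sensation: dict  # e.g. pressure/stiffness cues

    class HeadMountedDisplay:
        def render_3d_model(self, anatomical_portion):
            print(f"rendering 3D model of {anatomical_portion}")

    class WearableSensor:
        def send_touch_input(self, region):
            print(f"touch input forwarded to sensor over {region}")

        def read_bio_signals(self):
            # Placeholder values standing in for real sensor readings.
            return BioSignals(physiological={"pulse_bpm": 78},
                              somatic_sensation={"stiffness": 0.6})

    class WearableHapticDevice:
        def generate_feedback(self, signals: BioSignals):
            print("haptic feedback driven by", signals.somatic_sensation)

    def examine(region="defect portion"):
        hmd, sensor, haptic = HeadMountedDisplay(), WearableSensor(), WearableHapticDevice()
        hmd.render_3d_model("forearm")        # render the anatomical portion
        sensor.send_touch_input(region)       # touch on the rendered region
        signals = sensor.read_bio_signals()   # bio-signals via the wearable sensor
        haptic.generate_feedback(signals)     # feedback in response to the touch

    examine()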
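
The projection entries above (patent 11575863 and publication 20220329765) describe a pipeline: acquire a depth map of the projection surface, feed the depth map and an image frame to a neural network-based model to obtain geometric correction values, modify the frame with those values, and project it. The sketch below illustrates that pipeline under my own assumptions; the function names are hypothetical, the "model" is a stub that returns an identity homography, and the warp is a simple nearest-neighbour resampling rather than anything specified in the patent.

    # Hypothetical sketch of the depth-based projection pipeline described in the
    # abstract; function names and the stub model are illustrative, not from the patent.
    import numpy as np

    def acquire_depth(shape=(120, 160)):
        """Stand-in for the depth sensor: returns a depth map in metres."""
        return np.full(shape, 2.0, dtype=np.float32)

    def correction_model(depth_map, frame):
        """Stand-in for the neural network-based model: maps (depth, frame)
        to geometric correction values, here a 3x3 homography."""
        return np.eye(3, dtype=np.float32)  # identity = no correction

    def apply_correction(frame, homography):
        """Warp the frame with the correction values (nearest-neighbour sampling)."""
        h, w = frame.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=0)
        src = np.linalg.inv(homography) @ pts
        src /= src[2]
        sx = np.clip(src[0].round().astype(int), 0, w - 1)
        sy = np.clip(src[1].round().astype(int), 0, h - 1)
        return frame[sy, sx].reshape(frame.shape)

    def project(frame):
        """Stand-in for the illumination circuitry."""
        print("projecting frame of shape", frame.shape)

    frame = np.zeros((120, 160, 3), dtype=np.uint8)  # first image frame
    depth = acquire_depth()                          # depth data for the surface
    H = correction_model(depth, frame)               # geometric correction values
    project(apply_correction(frame, H))              # project the modified frame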