Patents by Inventor Anbumani Subramanian

Anbumani Subramanian has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10440497
    Abstract: A mechanism is described for facilitating multi-modal dereverberation in far-field audio systems according to one embodiment. A method of embodiments, as described herein, includes performing geometry estimation of a geographical space based on visuals of the space received from one or more cameras of a computing device. The method may further include computing reverberation time based on the geometry estimation that is further based on the visuals, and computing and applying dereverberation based on the reverberation time.
    Type: Grant
    Filed: November 17, 2017
    Date of Patent: October 8, 2019
    Assignee: Intel Corporation
    Inventors: Raghavendra Rao R, Przemyslaw Maziewski, Adam Kupryjanow, Anbumani Subramanian
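
As a rough, unofficial illustration of the idea summarized above (not the patented implementation): a reverberation time can be estimated from camera-derived room geometry with Sabine's formula and then used to attenuate an estimate of late reverberation. The room dimensions, absorption coefficient, frame timing, and suppression floor below are assumed values.

```python
# Hedged sketch only: RT60 from room geometry (Sabine's formula), then a
# simple Wiener-style suppression of estimated late reverberation.
import numpy as np

def sabine_rt60(length_m, width_m, height_m, absorption=0.3):
    """RT60 for a box-shaped room; absorption is an average coefficient."""
    volume = length_m * width_m * height_m
    surface = 2 * (length_m * width_m + length_m * height_m + width_m * height_m)
    return 0.161 * volume / (absorption * surface)

def suppress_late_reverb(power_spec, rt60, hop_s=0.016, delay_frames=4, floor=0.1):
    """power_spec: (frames, bins) STFT power. Returns a dereverberated copy."""
    # Polack-style exponential energy decay over `delay_frames` of lag.
    decay = np.exp(-2.0 * 6.91 * delay_frames * hop_s / rt60)
    out = power_spec.copy()
    for t in range(delay_frames, out.shape[0]):
        late = decay * power_spec[t - delay_frames]          # late-reverb estimate
        gain = np.maximum(1.0 - late / np.maximum(out[t], 1e-12), floor)
        out[t] *= gain
    return out

rt60 = sabine_rt60(5.0, 4.0, 3.0)    # geometry as estimated from the camera view
print(f"estimated RT60 ≈ {rt60:.2f} s")
```

In practice the geometry estimate would come from the visual pipeline the abstract describes, and the suppression would run inside a full STFT analysis/synthesis chain.
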
  • Publication number: 20190028829
    Abstract: A mechanism is described for facilitating multi-modal dereverberation in far-field audio systems according to one embodiment. A method of embodiments, as described herein, includes performing geometry estimation of a geographical space based on visuals of the space received from one or more cameras of a computing device. The method may further include computing reverberation time based on the geometry estimation that is further based on the visuals, and computing and applying dereverberation based on the reverberation time.
    Type: Application
    Filed: November 17, 2017
    Publication date: January 24, 2019
    Applicant: Intel Corporation
    Inventors: Raghavendra Rao R, Przemyslaw Maziewski, Adam Kupryjanow, Anbumani Subramanian
  • Patent number: 9626564
    Abstract: Techniques are disclosed for improving perceived eye-contact in video communications on personal communication devices that have a camera positioned offset slightly away from a display screen, such as found in tablets, mobile phones, laptops, desktops, ultrabooks, all-in-ones, and the like. A three-dimensional mesh, such as a point cloud, may be created from an image and depth information that is captured of the user. A viewing direction of the user is determined by assessing the three-dimensional mesh and the mesh is rotated to minimize the angle between the viewing direction and a line of sight between the user's eyes and the camera.
    Type: Grant
    Filed: November 17, 2014
    Date of Patent: April 18, 2017
    Assignee: Intel Corporation
    Inventors: Raghavendra R. Rao, Anbumani Subramanian
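
A minimal sketch of the rotation step described above, assuming the viewing direction, eye midpoint, and camera position have already been recovered from the image and depth data. It is not the patented method, just a Rodrigues-rotation illustration of aligning the gaze with the eye-to-camera line of sight.

```python
# Hedged sketch: rotate a face point cloud about the eye midpoint so the
# estimated viewing direction lines up with the line of sight to the camera.
# Assumes the two directions are not opposite, which holds for the small
# camera offsets the abstract describes.
import numpy as np

def rotation_aligning(a, b):
    """Rotation matrix taking the direction of a onto the direction of b."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)

def correct_gaze(points, eye_center, view_dir, camera_pos):
    """points: (N, 3) point cloud of the user's head in camera coordinates."""
    R = rotation_aligning(view_dir, camera_pos - eye_center)
    return (points - eye_center) @ R.T + eye_center
```
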
  • Patent number: 9619018
    Abstract: In one example, a method for multimodal human-machine interaction includes sensing a body posture of a participant using a camera (605) and evaluating the body posture to determine a posture-based probability of communication modalities from the participant (610). The method further includes detecting control input through a communication modality from the participant to the multimedia device (615) and weighting the control input by the posture-based probability (620).
    Type: Grant
    Filed: May 23, 2011
    Date of Patent: April 11, 2017
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Ramadevi Vennelakanti, Anbumani Subramanian, Prasenjit Dey, Sriganesh Madhvanath, Dinesh Mandalapu
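
A small, hypothetical sketch of the weighting step: posture gives a prior over modalities, and each recognizer's confidence is multiplied by that prior before the device decides which input to act on. The posture labels and probability table are invented for illustration.

```python
# Hedged sketch: weight each modality's recognizer confidence by a
# posture-derived prior. The table below is an assumption, not from the filing.
POSTURE_PRIOR = {
    "leaning_forward": {"gesture": 0.2, "speech": 0.3, "touch": 0.5},
    "reclined":        {"gesture": 0.4, "speech": 0.5, "touch": 0.1},
}

def weight_by_posture(posture, raw_confidences):
    """raw_confidences: modality -> recognizer confidence in [0, 1]."""
    prior = POSTURE_PRIOR.get(posture, {})
    return {m: conf * prior.get(m, 0.0) for m, conf in raw_confidences.items()}

weighted = weight_by_posture("reclined", {"speech": 0.7, "touch": 0.9})
chosen = max(weighted, key=weighted.get)   # modality the device acts on
print(chosen, weighted[chosen])
```
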
  • Publication number: 20170024017
    Abstract: Presented is a method and system for processing a gesture performed by a user of an input device. The method comprises detecting the gesture and determining a distance of the input device from a predetermined location. A user command is then determined based on the detected gesture and the determined distance.
    Type: Application
    Filed: October 5, 2016
    Publication date: January 26, 2017
    Inventors: Rahul Ajmera, Anbumani Subramanian, Sriganesh Madhvanath
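
The sketch below illustrates the claimed combination of gesture and distance in the simplest possible way: the same gesture resolves to different commands depending on how far the input device is from a reference location. The gesture names, distance bands, and command table are assumptions, not part of the filing.

```python
# Hedged sketch: command = f(gesture, distance band). Everything named here
# is illustrative.
COMMANDS = {
    ("swipe_left", "near"): "next_photo",
    ("swipe_left", "far"):  "next_album",
    ("circle",     "near"): "zoom_in",
    ("circle",     "far"):  "volume_up",
}

def resolve_command(gesture, distance_m, near_threshold_m=1.0):
    band = "near" if distance_m < near_threshold_m else "far"
    return COMMANDS.get((gesture, band))

print(resolve_command("swipe_left", 2.3))   # -> 'next_album'
```
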
  • Patent number: 9477324
    Abstract: Presented is a method and system for processing a gesture performed by a user of an input device. The method comprises detecting the gesture and determining a distance of the input device from a predetermined location. A user command is then determined based on the detected gesture and the determined distance.
    Type: Grant
    Filed: May 13, 2010
    Date of Patent: October 25, 2016
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Rahul Ajmera, Anbumani Subramanian, Sriganesh Madhvanath
  • Publication number: 20160142673
    Abstract: Techniques are disclosed for improving perceived eye-contact in video communications on personal communication devices that have a camera positioned offset slightly away from a display screen, such as found in tablets, mobile phones, laptops, desktops, ultrabooks, all-in-ones, and the like. A three-dimensional mesh, such as a point cloud, may be created from an image and depth information that is captured of the user. A viewing direction of the user is determined by assessing the three-dimensional mesh and the mesh is rotated to minimize the angle between the viewing direction and a line of sight between the user's eyes and the camera.
    Type: Application
    Filed: November 17, 2014
    Publication date: May 19, 2016
    Applicant: Intel Corporation
    Inventors: Raghavendra R. Rao, Anbumani Subramanian
  • Patent number: 9171200
    Abstract: A method of identifying gestural interaction comprises detecting a user with an imaging device, detecting with the imaging device the depth value at the centroid of the user with respect to the imaging device, detecting with the imaging device the closest distance of the user with respect to the imaging device, and, with a processor, identifying the initiation of a gestural interaction when the ratio of the closest distance to the depth value at the centroid of the user is above a predetermined threshold. A computer program product for identifying initiation and termination of gestural interaction within a gestural interaction system comprises a computer readable storage medium having computer usable program code embodied therewith, the computer usable program code comprising computer usable program code that identifies the initiation of a gestural interaction by a user depending on whether a virtual bubble around the user has been broken.
    Type: Grant
    Filed: March 4, 2011
    Date of Patent: October 27, 2015
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Anbumani Subramanian, Guruprasad Hegde, Sriganesh Madhvanath
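
A minimal sketch of the initiation test, assuming a depth map and a user segmentation mask are available. Which way round the ratio is taken, the use of the median as the centroid depth, and the threshold value are all assumptions made for illustration.

```python
# Hedged sketch: the "virtual bubble" is considered broken when the user's
# closest depth (e.g. an outstretched hand) is markedly nearer than the depth
# around the user's centroid.
import numpy as np

def interaction_initiated(depth_map, user_mask, threshold=1.15):
    """depth_map: (H, W) depths in metres; user_mask: boolean pixels of the user."""
    user_depths = depth_map[user_mask]
    centroid_depth = float(np.median(user_depths))   # stand-in for the centroid depth
    closest = float(user_depths.min())               # nearest part of the user
    return centroid_depth / max(closest, 1e-6) > threshold
```
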
  • Patent number: 9046991
    Abstract: A system and method for dynamically displaying structurally dissimilar thumbnail images generated from multiple pages of an electronic document based on a display size of a display device is disclosed. In one embodiment, one or more candidate images are selected based on a generated metric for dissimilarity for each one of multiple images associated with multiple pages of an electronic document, where the metric for dissimilarity for each one of the multiple images is generated by comparing each image of the multiple images with substantially previous images of the multiple images. Then, the selected one or more candidate images are dynamically displayed as structurally dissimilar thumbnail images based on a display size of a display device.
    Type: Grant
    Filed: January 21, 2010
    Date of Patent: June 2, 2015
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Anbumani Subramanian, Prasenjit Dey
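
A hypothetical sketch of the selection step: each page image is scored against the thumbnails already chosen and kept only if it is sufficiently dissimilar, with the maximum number of thumbnails driven by the display size. The dissimilarity metric (mean absolute difference of grayscale pages) and the threshold are assumptions; the abstract does not specify them.

```python
# Hedged sketch: greedy selection of structurally dissimilar page thumbnails.
import numpy as np

def dissimilarity(a, b):
    """a, b: equally sized 2-D grayscale page images with values in [0, 1]."""
    return float(np.mean(np.abs(a - b)))

def select_thumbnails(pages, max_thumbs, min_dissimilarity=0.15):
    """Keep pages that differ enough from every thumbnail already chosen."""
    selected = [pages[0]]                              # always keep the first page
    for page in pages[1:]:
        if len(selected) >= max_thumbs:                # max_thumbs derived from display size
            break
        if min(dissimilarity(page, s) for s in selected) > min_dissimilarity:
            selected.append(page)
    return selected
```
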
  • Patent number: 8768006
    Abstract: Systems, methods, and machine readable and executable instructions are provided for hand gesture recognition. A method for hand gesture recognition can include detecting, with an image input device in communication with a computing device, movement of an object. A hand pose associated with the moving object is recognized and a response corresponding to the hand pose is initiated.
    Type: Grant
    Filed: December 23, 2010
    Date of Patent: July 1, 2014
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Anbumani Subramanian, Vinod Pathangay, Dinesh Mandalapu
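
A very small sketch of the pipeline shape described above: detect motion, recognize a hand pose, and initiate the corresponding response. The pose classifier is a stub and the pose-to-response table is invented; a real system would use a trained recognizer.

```python
# Hedged sketch of the pipeline only; names and thresholds are assumptions.
import numpy as np

RESPONSES = {"open_palm": "pause", "fist": "play", "point": "select"}

def motion_detected(prev_frame, frame, threshold=12.0):
    """prev_frame, frame: (H, W) grayscale frames from the image input device."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    return float(diff.mean()) > threshold

def classify_pose(frame):
    # Placeholder: a real system would classify the segmented hand region.
    return "open_palm"

def handle_frame(prev_frame, frame):
    """Return the response to initiate, or None if nothing moved."""
    if motion_detected(prev_frame, frame):
        return RESPONSES.get(classify_pose(frame))
    return None
```
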
  • Patent number: 8730157
    Abstract: Hand pose recognition comprises determining an initial hand pose estimate for a captured input hand pose and performing iterations based upon hand pose estimates and residues between such estimates and the captured input hand pose. One or more control signals are generated based upon the hand pose recognition.
    Type: Grant
    Filed: January 7, 2011
    Date of Patent: May 20, 2014
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Yogesh Sankarasubramaniam, Krusheel Munnangi, Anbumani Subramanian
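
A hedged sketch of the iterate-on-residues idea, cast as a matching-pursuit-style loop over a library of pose templates. The feature representation, update rule, and stopping test are assumptions, since the abstract leaves them unspecified.

```python
# Hedged sketch: refine a pose estimate by repeatedly matching the current
# residue against a template library until the residue stops shrinking.
import numpy as np

def recognize_pose(observed, library, max_iters=5, tol=1e-3):
    """observed: feature vector of the input pose; library: (num_poses, dim) templates."""
    estimate = np.zeros_like(observed, dtype=float)
    residue = observed - estimate
    matched = []
    for _ in range(max_iters):
        scores = library @ residue                     # template best explaining the residue
        best = int(np.argmax(np.abs(scores)))
        matched.append(best)
        coeff = scores[best] / float(np.dot(library[best], library[best]))
        estimate = estimate + coeff * library[best]
        new_residue = observed - estimate
        if np.linalg.norm(residue) - np.linalg.norm(new_residue) < tol:
            break                                      # residue stopped improving: done
        residue = new_residue
    return matched, estimate, residue                  # matched templates drive the control signal
```
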
  • Publication number: 20140132505
    Abstract: In one example, a method for multimodal human-machine interaction includes sensing a body posture of a participant using a camera (605) and evaluating the body posture to determine a posture-based probability of communication modalities from the participant (610). The method further includes detecting control input through a communication modality from the participant to the multimedia device (615) and weighting the control input by the posture-based probability (620).
    Type: Application
    Filed: May 23, 2011
    Publication date: May 15, 2014
    Inventors: Ramadevi Vennelakanti, Anbumani Subramanian, Prasenjit Dey, Sriganesh Madhvanath, Dinesh Mandalapu
  • Publication number: 20130343611
    Abstract: A method of identifying gestural interaction comprises detecting a user with an imaging device, detecting with the imaging device the depth value at the centroid of the user with respect to the imaging device, detecting with the imaging device the closest distance of the user with respect to the imaging device, and, with a processor, identifying the initiation of a gestural interaction when the ratio of the closest distance to the depth value at the centroid of the user is above a predetermined threshold. A computer program product for identifying initiation and termination of gestural interaction within a gestural interaction system comprises a computer readable storage medium having computer usable program code embodied therewith, the computer usable program code comprising computer usable program code that identifies the initiation of a gestural interaction by a user depending on whether a virtual bubble around the user has been broken.
    Type: Application
    Filed: March 4, 2011
    Publication date: December 26, 2013
    Applicant: Hewlett-Packard Development Company, L.P.
    Inventors: Anbumani Subramanian, Guruprasad Hegde, Sriganesh Madhvanath
  • Publication number: 20130268476
    Abstract: A system and method for classification of moving objects and user authoring of new object classes is disclosed. In one embodiment, in a method of classification of moving objects, a moving object is inputted. Then, an object descriptor and a motion descriptor are extracted from the inputted moving object. Multiple initial candidate library object descriptors are identified from an object library and a motion library using the extracted object descriptor and the extracted motion descriptor. An initial object class estimate is identified based on the identified multiple initial candidate library object descriptors. Then, an initial residue is computed based on the extracted object descriptor and the identified multiple initial candidate library object descriptors associated with the initial object class estimate. The object class estimates are iteratively identified and it is determined whether the object class estimates converge based on a stopping criterion.
    Type: Application
    Filed: December 24, 2010
    Publication date: October 10, 2013
    Applicant: Hewlett-Packard Development Company, L.P.
    Inventors: Yogesh Sankarasubramaniam, Krusheel Munnangi, Anbumani Subramanian, Avinash Sharma, Serene Banerjee
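
A hedged sketch of the candidate-selection and voting steps only; the iterative refinement and its stopping criterion are left out. Descriptors are plain feature vectors and the libraries are lists of (label, vector) pairs, which is an assumption about the representation.

```python
# Hedged sketch: nearest-library candidates from both libraries, a majority
# vote for the initial class estimate, and a residue against the best match.
import numpy as np

def candidates(descriptor, library, k=3):
    """library: list of (class_label, vector) pairs; return the k nearest entries."""
    return sorted(library, key=lambda item: np.linalg.norm(descriptor - item[1]))[:k]

def initial_class_estimate(obj_desc, motion_desc, object_library, motion_library):
    obj_picks = candidates(obj_desc, object_library)
    mot_picks = candidates(motion_desc, motion_library)
    labels = [label for label, _ in obj_picks + mot_picks]
    estimate = max(set(labels), key=labels.count)              # simple majority vote
    matching = [vec for label, vec in obj_picks if label == estimate]
    residue = min((float(np.linalg.norm(obj_desc - vec)) for vec in matching),
                  default=float("inf"))
    return estimate, residue
```
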
  • Patent number: 8462110
    Abstract: Presented is an apparatus for capturing user input by pointing at a surface using pointing means. The apparatus comprises: a range camera for producing a depth-image of the pointing means; and a processor. The processor is adapted to determine from the depth-image the position and orientation of a pointing axis of the pointing means; extrapolate from the position and orientation the point of intersection of the axis with the surface; and control an operation based on the location of the point of intersection.
    Type: Grant
    Filed: January 5, 2010
    Date of Patent: June 11, 2013
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Vinod Pathangay, Anbumani Subramanian
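
The extrapolation step above is essentially a ray-plane intersection. The sketch below assumes the pointing axis (a point and a direction) has already been fitted to the depth image and that the target surface is planar; the coordinates are illustrative.

```python
# Hedged sketch: where does the pointing axis meet a planar surface?
import numpy as np

def intersect_plane(axis_point, axis_dir, plane_point, plane_normal):
    """Return the 3-D point where the pointing axis crosses the plane, or None."""
    denom = float(np.dot(axis_dir, plane_normal))
    if abs(denom) < 1e-9:                      # axis parallel to the surface
        return None
    t = float(np.dot(plane_point - axis_point, plane_normal)) / denom
    return axis_point + t * np.asarray(axis_dir)

hit = intersect_plane(np.array([0.0, 0.2, 0.5]),       # hand position from depth-image fit
                      np.array([0.0, -0.1, 1.0]),      # pointing direction (any length)
                      np.array([0.0, 0.0, 2.0]),       # a point on the display plane
                      np.array([0.0, 0.0, -1.0]))      # display plane normal
print(hit)   # the pointed-at location used to control an operation
```
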
  • Publication number: 20120278729
    Abstract: Provided is a method of assigning user interaction controls. The method assigns, in a scenario where multiple co-present users are simultaneously providing user inputs to a computing device, a first level of user interaction controls related to an object on the computing device to a single user and a second level of user interaction controls related to the object to all co-present simultaneous users of the computing device.
    Type: Application
    Filed: April 27, 2012
    Publication date: November 1, 2012
    Inventors: Ramadevi Vennelakanti, Prasenjit Dey, Sriganesh Madhvanath, Anbumani Subramanian, Dinesh Mandalapu
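
A toy sketch of the two-level policy: some controls on the object are reserved for a single user while others are open to every co-present user. Which controls sit in which level is an assumption made for illustration.

```python
# Hedged sketch of the two control levels described in the abstract above.
FIRST_LEVEL = {"delete", "rename", "share"}      # reserved for the single "owner" user
SECOND_LEVEL = {"rotate", "zoom", "annotate"}    # open to all co-present users

def is_allowed(user, owner, action):
    if action in SECOND_LEVEL:
        return True
    return action in FIRST_LEVEL and user == owner

print(is_allowed("guest", "owner", "zoom"))     # True
print(is_allowed("guest", "owner", "delete"))   # False
```
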
  • Patent number: 8244062
    Abstract: An image processing method comprises analyzing an image of a portion of text, and detecting the inter-line spacing and the inter-word spacing across the area of the image. Based on the inter-line and inter-word spacings, a quadrilateral shape is derived which represents the deformation of the text image from an undistorted image. The image is modified to perform perspective correction based on the derived quadrilateral.
    Type: Grant
    Filed: September 22, 2008
    Date of Patent: August 14, 2012
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Prasenjit Dey, Anbumani Subramanian
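
A sketch of the final correction step only, assuming the deformation quadrilateral has already been derived from the inter-line and inter-word spacing: estimate the homography that maps that quadrilateral back to an upright rectangle (the actual image warp is omitted). The corner coordinates are illustrative.

```python
# Hedged sketch: direct linear transform (DLT) from the deformed text
# quadrilateral to an axis-aligned rectangle of the desired size.
import numpy as np

def homography_from_quad(quad, width, height):
    """quad: four (x, y) corners, ordered TL, TR, BR, BL, of the deformed text block."""
    dst = [(0, 0), (width, 0), (width, height), (0, height)]
    rows = []
    for (x, y), (u, v) in zip(quad, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of this 8x9 system.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

H = homography_from_quad([(12, 8), (410, 30), (405, 300), (15, 280)], 400, 280)
print(H)   # warping the image with H performs the perspective correction
```
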
  • Publication number: 20120197991
    Abstract: Intuitive interaction may be performed over a network. The interaction may include collection of feedback from participants, wherein the feedback is active, passive, or a combination of both. The feedback from the participants may be aggregated and the aggregated feedback can be provided to at least one participant or a non-participant.
    Type: Application
    Filed: December 28, 2011
    Publication date: August 2, 2012
    Applicant: Hewlett-Packard Development Company, L.P.
    Inventors: Srinivasan Ramani, Sriganesh Madhvanath, Anbumani Subramanian
  • Publication number: 20120119984
    Abstract: Hand pose recognition comprises determining an initial hand pose estimate for a captured input hand pose and performing iterations based upon hand pose estimates and residues between such estimates and the captured input hand pose. One or more control signals are generated based upon the hand pose recognition.
    Type: Application
    Filed: January 7, 2011
    Publication date: May 17, 2012
    Inventors: Yogesh Sankarasubramaniam, Krusheel Munnangi, Anbumani Subramanian
  • Publication number: 20120093360
    Abstract: Systems, methods, and machine readable and executable instructions are provided for hand gesture recognition. A method for hand gesture recognition can include detecting, with an image input device in communication with a computing device, movement of an object. A hand pose associated with the moving object is recognized and a response corresponding to the hand pose is initiated.
    Type: Application
    Filed: December 23, 2010
    Publication date: April 19, 2012
    Inventors: Anbumani Subramanian, Vinod Pathangay, Dinesh Mandalapu