Patents by Inventor Anurag Bist

Anurag Bist has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11064257
    Abstract: In the present embodiment, a system and a method for tagging content based on individual cues, an emotional score, or an emotional profile is provided, where the content is a video file, a webpage, a mobile application, a product review, or a product demo video. The method involves authorizing a user to access the content; capturing user-specific data, application details, and content-specific data in response to the content in real time; analyzing the captured user-specific data, application details, and content-specific data to generate a user emotional profile; and tagging the user emotional profile with the content in a time-granular manner. (An illustrative sketch of time-granular tagging follows this entry.)
    Type: Grant
    Filed: March 19, 2020
    Date of Patent: July 13, 2021
    Assignee: Monet Networks, Inc.
    Inventors: Anurag Bist, Ramon Solves Pujol, Eric Leopold Frankel
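    A minimal sketch of what time-granular tagging of an emotional profile to content could look like. This is not taken from the patent; the names (EmotionSample, tag_content) and the one-second window are hypothetical choices for illustration.

    ```python
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class EmotionSample:
        timestamp_s: float          # position in the content timeline, in seconds
        scores: Dict[str, float]    # e.g. {"joy": 0.7, "surprise": 0.1}

    def tag_content(content_id: str,
                    samples: List[EmotionSample],
                    window_s: float = 1.0) -> Dict[str, object]:
        """Aggregate per-sample emotion scores into fixed time windows,
        producing a time-granular emotional tag track for the content."""
        sums: Dict[float, Dict[str, float]] = {}
        counts: Dict[float, int] = {}
        for s in samples:
            bucket = round(s.timestamp_s // window_s * window_s, 3)
            agg = sums.setdefault(bucket, {})
            counts[bucket] = counts.get(bucket, 0) + 1
            for emotion, value in s.scores.items():
                agg[emotion] = agg.get(emotion, 0.0) + value
        track = []
        for bucket in sorted(sums):
            n = counts[bucket]
            track.append({"t": bucket,
                          "profile": {e: v / n for e, v in sums[bucket].items()}})
        return {"content_id": content_id, "tags": track}

    # Example usage with made-up samples
    profile = tag_content("demo-video-42", [
        EmotionSample(0.2, {"joy": 0.8}),
        EmotionSample(0.7, {"joy": 0.6, "surprise": 0.2}),
        EmotionSample(1.4, {"joy": 0.1}),
    ])
    print(profile["tags"][0])
    ```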
  • Publication number: 20200288206
    Abstract: In the present embodiment, a system and a method for tagging content based on individual cues, an emotional score, or an emotional profile is provided, where the content is a video file, a webpage, a mobile application, a product review, or a product demo video. The method involves authorizing a user to access the content; capturing user-specific data, application details, and content-specific data in response to the content in real time; analyzing the captured user-specific data, application details, and content-specific data to generate a user emotional profile; and tagging the user emotional profile with the content in a time-granular manner.
    Type: Application
    Filed: March 19, 2020
    Publication date: September 10, 2020
    Inventors: Anurag Bist, Ramon Solves Pujol, Eric Leopold Frankel
  • Patent number: 10638197
    Abstract: A system and method for media content evaluation is provided, based on combining multi-modal inputs from audiences, including reactions and emotions recorded in real time on a frame-by-frame basis as participants watch the media content. The real-time reactions and emotions are recorded in two different campaigns, each with a different set of participants. For the first set of participants, facial expressions are captured; for the second set, reactions are captured. The facial-expression analysis and the reaction analysis of the two sets of participants are correlated to identify the segments that are engaging and interesting to all participants. (An illustrative sketch of this correlation step follows this entry.)
    Type: Grant
    Filed: November 21, 2018
    Date of Patent: April 28, 2020
    Assignee: Monet Networks, Inc.
    Inventors: Anurag Bist, Ramon Solves Pujol, Eric Leopold Frankel
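    A rough sketch of how two aligned per-second engagement signals, one from a facial-expression campaign and one from a reaction campaign, might be correlated segment by segment to surface segments that engage both groups. The function name, segment length, and threshold are assumptions for illustration, not the patented algorithm.

    ```python
    import numpy as np

    def engaging_segments(facial_scores: np.ndarray,
                          reaction_scores: np.ndarray,
                          segment_len: int = 10,
                          threshold: float = 0.5):
        """Split two aligned per-second engagement signals into segments and
        keep segments where both signals are high and positively correlated."""
        n = min(len(facial_scores), len(reaction_scores))
        segments = []
        for start in range(0, n - segment_len + 1, segment_len):
            f = facial_scores[start:start + segment_len]
            r = reaction_scores[start:start + segment_len]
            corr = np.corrcoef(f, r)[0, 1]
            if f.mean() > threshold and r.mean() > threshold and corr > 0:
                segments.append((start, start + segment_len, float(corr)))
        return segments

    facial = np.random.rand(120)     # campaign 1: facial-expression engagement (stand-in data)
    reaction = np.random.rand(120)   # campaign 2: reaction-based engagement (stand-in data)
    print(engaging_segments(facial, reaction))
    ```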
  • Publication number: 20190364089
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional profile of the user to rate the media content or event; and sharing the emotional profile within the connected environment. (An illustrative sketch of this flow follows this entry.)
    Type: Application
    Filed: November 28, 2018
    Publication date: November 28, 2019
    Inventor: Anurag Bist
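    A minimal, hypothetical sketch of the flow the abstract describes: share content, capture a reaction, build an emotional profile, and share it back into the connected environment. All names are made up, and the capture step is a placeholder for real sensor input.

    ```python
    from typing import Dict, List

    def capture_reaction(user_id: str, content_id: str) -> Dict[str, float]:
        """Placeholder for the capture step (camera / sensor input in practice)."""
        return {"joy": 0.6, "surprise": 0.3, "neutral": 0.1}

    def build_emotional_profile(reactions: List[Dict[str, float]]) -> Dict[str, float]:
        """Average captured reactions into a single emotional profile."""
        profile: Dict[str, float] = {}
        for r in reactions:
            for emotion, value in r.items():
                profile[emotion] = profile.get(emotion, 0.0) + value
        return {e: v / len(reactions) for e, v in profile.items()}

    def share_profile(environment: list, content_id: str, profile: Dict[str, float]) -> None:
        """Publish the profile as a rating of the content within the connected environment."""
        environment.append({"content_id": content_id, "rating": profile})

    feed: list = []
    reactions = [capture_reaction("user-1", "clip-7"), capture_reaction("user-2", "clip-7")]
    share_profile(feed, "clip-7", build_emotional_profile(reactions))
    print(feed)
    ```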
  • Publication number: 20190297384
    Abstract: A system and method for media content evaluation is provided, based on combining multi-modal inputs from audiences, including reactions and emotions recorded in real time on a frame-by-frame basis as participants watch the media content. The real-time reactions and emotions are recorded in two different campaigns, each with a different set of participants. For the first set of participants, facial expressions are captured; for the second set, reactions are captured. The facial-expression analysis and the reaction analysis of the two sets of participants are correlated to identify the segments that are engaging and interesting to all participants.
    Type: Application
    Filed: November 21, 2018
    Publication date: September 26, 2019
    Inventors: Anurag Bist, Ramon Solves Pujol, Eric Leopold Frankel
  • Publication number: 20190213909
    Abstract: A system and a method for capturing and analyzing the non-verbal and behavioral cues of users in a network is provided. Sensors in the client device capture the user's behavioral and sensory cues as a reaction to an event or a particular piece of content. The client device then either processes these sensory or behavioral inputs or sends the captured inputs to an analysis module on the server. The analysis module processes one or more sensory inputs on a per-capture basis and derives analytics for the particular event they correspond to. The analysis module consists of a Classification Engine that first segments the initially captured cues into Intermediate States, followed by a Decision Engine that aggregates these Intermediate States from multiple instances of users and events, together with other information about the user and the event, to arrive at a Final State corresponding to the user's reaction to the event. (An illustrative sketch of this two-stage pipeline follows this entry.)
    Type: Application
    Filed: December 5, 2018
    Publication date: July 11, 2019
    Inventor: Anurag Bist
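    A rough, hypothetical sketch of the two-stage pipeline the abstract outlines: a classification stage that maps raw captured cues to intermediate states, followed by a decision stage that aggregates intermediate states across users and instances into a final state. The cue fields, thresholds, and majority-vote aggregation are illustrative assumptions, not the patented logic.

    ```python
    from collections import Counter
    from typing import Dict, List

    def classify_cue(cue: Dict[str, float]) -> str:
        """Classification engine: map one captured cue to an intermediate state."""
        if cue.get("smile", 0.0) > 0.5:
            return "positive"
        if cue.get("frown", 0.0) > 0.5:
            return "negative"
        return "neutral"

    def decide_final_state(intermediate_states: List[str]) -> str:
        """Decision engine: aggregate intermediate states from many users/instances
        into a final state describing the overall reaction to the event."""
        counts = Counter(intermediate_states)
        state, _ = counts.most_common(1)[0]
        return state

    cues = [{"smile": 0.8}, {"smile": 0.1, "frown": 0.7}, {"smile": 0.9}]
    states = [classify_cue(c) for c in cues]
    print(decide_final_state(states))   # -> "positive"
    ```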
  • Publication number: 20170251262
    Abstract: A system and method for media content evaluation is provided, based on combining multi-modal inputs from audiences, including reactions and emotions recorded in real time on a frame-by-frame basis as participants watch the media content. The real-time reactions and emotions are recorded in two different campaigns, each with a different set of participants. For the first set of participants, facial expressions are captured; for the second set, reactions are captured. The facial-expression analysis and the reaction analysis of the two sets of participants are correlated to identify the segments that are engaging and interesting to all participants.
    Type: Application
    Filed: May 15, 2017
    Publication date: August 31, 2017
    Inventors: Anurag Bist, Ramon Solves Pujol, Eric Leopold Frankel
  • Publication number: 20160241533
    Abstract: A system and a method for tagging content based on individual cues, an emotional score, or an emotional profile is provided, where the content is a video file, a webpage, a mobile application, a product review, or a product demo video. The method involves authorizing a user to access the content; capturing user-specific data, application details, and content-specific data in response to the content in real time; analyzing the captured user-specific data, application details, and content-specific data to generate a user emotional profile; and tagging the user emotional profile with the content in a time-granular manner.
    Type: Application
    Filed: November 16, 2015
    Publication date: August 18, 2016
    Inventors: Anurag Bist, Hamid Lalani
  • Patent number: 9202251
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional score for the user to rate the media content or event; and sharing the emotional score within the connected environment.
    Type: Grant
    Filed: November 7, 2011
    Date of Patent: December 1, 2015
    Inventors: Anurag Bist, Hamid Lalani
  • Patent number: 9026476
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional score for the user to rate the media content or event; and sharing the emotional score within the connected environment.
    Type: Grant
    Filed: November 7, 2011
    Date of Patent: May 5, 2015
    Inventor: Anurag Bist
  • Publication number: 20140324808
    Abstract: A new method for semantic segmentation and tagging of a patent or a technical document is provided. The semantic tags are used for search and display of patents. The semantic tagging method involves creating automatic tags for the preamble, elements, and sub-elements of patent claims, and for their respective attributes and relationships. The tags are used in patent search to improve search performance, and in a novel user interface for viewing and analyzing one or more patents. The user interface provides a unique way to display the different tags of a patent, which provides critical information for comprehending the patent and helps create better search queries related to it. (An illustrative sketch of claim segmentation follows this entry.)
    Type: Application
    Filed: March 17, 2014
    Publication date: October 30, 2014
    Inventors: Sumeet Sandhu, Anurag Bist
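    A toy sketch of splitting an independent claim into a preamble and elements using simple string heuristics. This is not the patented method; real claim parsing also handles sub-elements, attributes, and relationships, and the regex and names here are illustrative.

    ```python
    import re
    from typing import Dict, List

    def segment_claim(claim_text: str) -> Dict[str, object]:
        """Split a claim into a preamble (text up to 'comprising'/'consisting of')
        and a list of elements (the semicolon-separated limitations that follow)."""
        match = re.search(r"\b(comprising|consisting of)\b", claim_text, re.IGNORECASE)
        if not match:
            return {"preamble": claim_text.strip(), "elements": []}
        preamble = claim_text[:match.end()].strip()
        body = claim_text[match.end():]
        elements: List[str] = [e.strip(" :;.\n") for e in body.split(";") if e.strip(" :;.\n")]
        return {"preamble": preamble, "elements": elements}

    claim = ("1. A system for tagging content, comprising: a capture module that records "
             "user reactions; an analysis module that generates an emotional profile; and "
             "a tagging module that attaches the profile to the content.")
    print(segment_claim(claim))
    ```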
  • Publication number: 20130342433
    Abstract: Extended operation of battery-powered devices that include a visual display, such as an LCD screen in a cell phone or a personal media player, depends on low power consumption of the display. To save display power, dynamic backlight control can be used, involving adjustment of backlight brightness combined with transformation of the video data to be displayed. When displaying a video or movie, in the interest of minimizing perceived flicker, dynamic changes in backlight brightness can be limited to coincide with scene changes. Video scene changes can be determined prior to their ultimate use in a client device, and the available scene-change information can be downloaded along with the video to the client device. Alternatively, scene-change information determined on the client device or elsewhere can be stored on the client device for later use during actual video display. (An illustrative sketch of scene-synchronized backlight control follows this entry.)
    Type: Application
    Filed: February 4, 2011
    Publication date: December 26, 2013
    Inventors: Ananth Sankar, Anurag Bist
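    An illustrative sketch of limiting backlight level changes to scene boundaries to avoid perceived flicker. The crude luminance-difference scene detector, the thresholds, and the brightness-proportional target level are assumptions for illustration, not the patented control scheme.

    ```python
    import numpy as np

    def is_scene_change(prev_frame: np.ndarray, frame: np.ndarray, threshold: float = 30.0) -> bool:
        """Crude scene-change detector: mean absolute luminance difference."""
        return float(np.mean(np.abs(frame.astype(float) - prev_frame.astype(float)))) > threshold

    def backlight_schedule(frames: list, initial_level: float = 1.0) -> list:
        """Choose a backlight level per frame, updating it only at scene changes."""
        levels = []
        level = initial_level
        prev = None
        for frame in frames:
            if prev is not None and is_scene_change(prev, frame):
                # Target level proportional to how bright the new scene is (floor at 0.3).
                level = max(0.3, float(frame.mean()) / 255.0)
            levels.append(level)
            prev = frame
        return levels

    # Example: three bright frames followed by three dark frames of a new scene
    frames = [np.full((4, 4), 200, dtype=np.uint8)] * 3 + [np.full((4, 4), 60, dtype=np.uint8)] * 3
    print(backlight_schedule(frames))   # level drops only at the scene boundary
    ```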
  • Publication number: 20130288212
    Abstract: A system and a method for capturing and analyzing the non-verbal and behavioral cues of users in a network is provided. Sensors in the client device capture the user's behavioral and sensory cues, and analysis is performed to derive analytics for the particular event they correspond to.
    Type: Application
    Filed: March 8, 2013
    Publication date: October 31, 2013
    Inventor: Anurag Bist
  • Publication number: 20130117375
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional score for the user to rate the media content or event; and sharing the emotional score within the connected environment.
    Type: Application
    Filed: November 7, 2011
    Publication date: May 9, 2013
    Inventors: Anurag Bist, Hamid Lalani
  • Publication number: 20120290508
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional score for the user to rate the media content or event; and sharing the emotional score within the connected environment.
    Type: Application
    Filed: November 7, 2011
    Publication date: November 15, 2012
    Inventor: Anurag Bist
  • Publication number: 20120265811
    Abstract: A system and a method for generating an emotional profile of a user and deriving inferences from analytics of the generated emotional profile is provided. The method involves sharing media content or an online event in a connected environment; capturing the user's reaction to the content or event; generating an emotional profile of the user to rate the media content or event; and sharing the emotional profile within the connected environment.
    Type: Application
    Filed: November 7, 2011
    Publication date: October 18, 2012
    Inventor: Anurag Bist
  • Publication number: 20120200484
    Abstract: Extended operation of battery-powered devices that include a visual display, such as an LCD screen in a cell phone or a personal media player, depends on low power consumption of the display. To save display power, dynamic backlight control can be used, involving adjustment of backlight brightness combined with transformation of the video data to be displayed. When displaying a video or movie, in the interest of minimizing perceived flicker, dynamic changes in backlight brightness can be limited to coincide with scene changes. Video scene changes can be determined prior to their ultimate use in a client device, and the available scene-change information can be downloaded along with the video to the client device. Alternatively, scene-change information determined on the client device or elsewhere can be stored on the client device for later use during actual video display.
    Type: Application
    Filed: February 4, 2011
    Publication date: August 9, 2012
    Inventors: Ananth Sankar, Anurag Bist
  • Patent number: 7873229
    Abstract: In visual display devices such as LCD devices with backlight illumination, the backlight typically consumes most of the device's battery power. In the interest of displaying a given pixel pattern at a minimized backlight level, the pattern can be transformed while maintaining image quality, with the transform determined from pixel luminance statistics. Aside from, or in addition to, being used for such minimization, a transform can also be used for image enhancement, so that a displayed image better meets a visual perception quality. In either case, the transform is preferably constrained to enforce one or several display attributes. In a network setting, the technique can be implemented in distributed fashion, so that subtasks of the technique are performed by different, interconnected processors such as server, client, and proxy processors. (An illustrative sketch of a luminance-statistics-driven transform follows this entry.)
    Type: Grant
    Filed: July 31, 2006
    Date of Patent: January 18, 2011
    Assignee: Moxair, Inc.
    Inventors: Ananth Sankar, David Romacho Rosell, Anurag Bist
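    A simplified sketch of the general idea: pick a reduced backlight level from a luminance-histogram statistic and apply a compensating pixel gain (with clipping) so the displayed image looks roughly unchanged. The 95th-percentile choice, the backlight floor, and the function names are illustrative assumptions, not the patented transform.

    ```python
    import numpy as np

    def backlight_and_transform(luma: np.ndarray, percentile: float = 95.0):
        """Choose a backlight scale from pixel luminance statistics and boost pixel
        values to compensate, clipping only the brightest pixels."""
        peak = np.percentile(luma, percentile)          # statistic of the luminance histogram
        backlight = max(float(peak) / 255.0, 0.1)       # dim the backlight to roughly this level
        gain = 1.0 / backlight                          # compensate in the pixel domain
        transformed = np.clip(luma.astype(float) * gain, 0, 255).astype(np.uint8)
        return backlight, transformed

    # Example with synthetic luminance data
    luma = np.clip(np.random.normal(100, 30, size=(240, 320)), 0, 255).astype(np.uint8)
    level, out = backlight_and_transform(luma)
    print(f"backlight scale: {level:.2f}")
    ```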
  • Patent number: 7692612
    Abstract: In visual display devices such as LCD devices with backlight illumination, the backlight typically consumes most of the device's battery power. In the interest of displaying a given pixel pattern at a minimized backlight level, the pattern can be transformed while maintaining image quality, with the transform determined from pixel luminance statistics. Aside from, or in addition to, such minimization, a transform can also be used for image enhancement, so that a displayed image better meets a visual perception quality. In either case, the transform is preferably constrained to enforce one or several display attributes.
    Type: Grant
    Filed: June 20, 2006
    Date of Patent: April 6, 2010
    Assignee: Moxair, Inc.
    Inventors: Ananth Sankar, David Romacho Rosell, Anurag Bist, Praveen Dua, Sriram Sundararajan
  • Patent number: 7627111
    Abstract: An embodiment of the present invention includes an adaptive predictor, a system white noise generator, and a background noise estimator. The adaptive predictor estimates the adaptive weights of an autoregressive (AR) model of the background noise from background samples in an echo canceler, and generates an adaptive error signal. The system white noise generator generates white noise using the adaptive error. The background noise estimator estimates the background noise using the white noise and the estimated adaptive weights. (An illustrative sketch of this structure follows this entry.)
    Type: Grant
    Filed: November 25, 2002
    Date of Patent: December 1, 2009
    Assignee: Intel Corporation
    Inventors: Neil J. Bershad, Anurag Bist, Stan Hsieh
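    A small sketch of the kind of processing the abstract names: an LMS-adapted AR predictor of the background noise whose prediction error corresponds to a whitened signal, and a white-noise-driven synthesis that excites the estimated AR model to regenerate background (comfort) noise. The order, step size, noise scale, and names are illustrative assumptions, not the patented estimator.

    ```python
    import numpy as np

    def estimate_ar_weights(noise: np.ndarray, order: int = 4, mu: float = 1e-3) -> np.ndarray:
        """LMS adaptive predictor: estimate AR weights of the background noise;
        the prediction (adaptive) error acts as an approximately white signal."""
        w = np.zeros(order)
        for n in range(order, len(noise)):
            x = noise[n - order:n][::-1]          # most recent samples first
            err = noise[n] - w @ x                # adaptive (prediction) error
            w += mu * err * x                     # LMS weight update
        return w

    def synthesize_background_noise(w: np.ndarray, length: int, scale: float = 0.1) -> np.ndarray:
        """Drive the estimated AR model with white noise to regenerate background noise."""
        order = len(w)
        out = np.zeros(length + order)
        white = scale * np.random.randn(length + order)   # system white noise generator
        for n in range(order, length + order):
            out[n] = w @ out[n - order:n][::-1] + white[n]
        return out[order:]

    background = 0.1 * np.random.randn(8000)              # stand-in for captured background samples
    weights = estimate_ar_weights(background)
    comfort = synthesize_background_noise(weights, 8000)
    ```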