Patents by Inventor Venkata Naga Vijaya Swetha Machanavajhala

Venkata Naga Vijaya Swetha Machanavajhala has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230343330
    Abstract: The techniques disclosed herein provide intelligent display of auditory world experiences. Specialized AI models are configured to display integrated visualizations for different aspects of the auditory signals that may be communicated during an event, such as a meeting, chat session, etc. For instance, a system can use a sentiment recognition model to identify specific characteristics of a speech input, such as volume or tone, provided by a participant. The system can also use a speech recognition model to identify keywords that can be used to distinguish portions of a transcript that are displayed. The system can also utilize an audio recognition model that is configured to analyze non-speech audio sounds for the purposes of identifying non-speech events. The system can then integrate the user interface attributes, distinguished portions of the transcript, and visual indicators describing the non-speech events.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Venkata Naga Vijaya Swetha MACHANAVAJHALA, Ryan Graham WILLIAMS, Sanghee OH, Ikuyo TSUNODA, William D. LEWIS, Jian WU, Daniel Charles TOMPKINS
  • Patent number: 10313845
    Abstract: Non-limiting examples of the present disclosure describe proactive speech detection on behalf of a user and alerting the user when a specific word, name, etc. is detected. Speech detection is actively executed through a computing device, where the speech detection analyzes spoken utterances in association with a dynamic grammar file stored locally on the computing device. An alert is generated that indicates when a spoken word of the utterances matches a word stored in the dynamic grammar file. The alert may be displayed, for example, through the computing device. The alert provides an indication that the spoken word was identified in the spoken utterances. In further examples, a buffered window of the spoken utterances associated with a detection of the specific word is captured. A live transcription of the content in the buffered window is generated and provided to a computing device of the user.
    Type: Grant
    Filed: June 6, 2017
    Date of Patent: June 4, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anirudh Koul, Venkata Naga Vijaya Swetha Machanavajhala, Stéphane Morichère-Matte, David Dai Wei Zhang, Anass Al-Wohoush, Jan Ervin Mickel Takata Clarin, Sheng-Ting Lin, Sitthinon Jinaphant, Shery Sharonjit Sumal
  • Publication number: 20180352390
    Abstract: Non-limiting examples of the present disclosure describe proactive speech detection on behalf of a user and alerting the user when a specific word, name, etc. is detected. Speech detection is actively executed through a computing device, where the speech detection analyzes spoken utterances in association with a dynamic grammar file stored locally on the computing device. An alert is generated that indicates when a spoken word of the utterances matches a word stored in the dynamic grammar file. The alert may be displayed, for example, through the computing device. The alert provides an indication that the spoken word was identified in the spoken utterances. In further examples, a buffered window of the spoken utterances associated with a detection of the specific word is captured. A live transcription of the content in the buffered window is generated and provided to a computing device of the user.
    Type: Application
    Filed: June 6, 2017
    Publication date: December 6, 2018
    Inventors: Anirudh Koul, Venkata Naga Vijaya Swetha Machanavajhala, Stéphane Morichère-Matte, David Dai Wei Zhang, Anass Al-Wohoush, Jan Ervin Mickel Takata Clarin, Sheng-Ting Lin, Sitthinon Jinaphant, Shery Sharonjit Sumal
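
The abstract for publication 20230343330 above describes a pipeline that combines a sentiment recognition model, a speech recognition model, and a non-speech audio recognition model into a single integrated visualization. What follows is a minimal, hypothetical Python sketch of such a pipeline; the class names, method signatures, and the mapping from speech characteristics to display attributes are illustrative assumptions, not the patented implementation or any Microsoft API.

    # Hypothetical sketch of the multi-model pipeline described in
    # publication 20230343330. All class and method names are illustrative
    # placeholders, not an actual product API.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class DisplayAttributes:
        # User-interface attributes derived from one speech segment.
        font_size: int = 12                      # scaled with detected volume
        color: str = "black"                     # mapped from detected tone
        keywords: List[str] = field(default_factory=list)
        non_speech_labels: List[str] = field(default_factory=list)

    class SentimentRecognitionModel:
        def analyze(self, audio) -> Dict[str, float]:
            # Would return characteristics such as {"volume": 0.8, "tone": -0.2}.
            raise NotImplementedError

    class SpeechRecognitionModel:
        def transcribe(self, audio) -> str:
            raise NotImplementedError

        def keywords(self, transcript: str) -> List[str]:
            # Keywords used to distinguish portions of the transcript.
            raise NotImplementedError

    class AudioEventModel:
        def detect_events(self, audio) -> List[str]:
            # Non-speech sounds, e.g. "door knock", "laughter".
            raise NotImplementedError

    def render_segment(audio, sentiment, speech, events) -> Dict:
        # Integrate the three model outputs into one visualization payload.
        characteristics = sentiment.analyze(audio)
        transcript = speech.transcribe(audio)
        attrs = DisplayAttributes(
            font_size=12 + int(characteristics.get("volume", 0.0) * 8),
            color="red" if characteristics.get("tone", 0.0) < 0 else "black",
            keywords=speech.keywords(transcript),
            non_speech_labels=events.detect_events(audio),
        )
        return {"transcript": transcript, "display": attrs}

A real system would plug trained models into these stubs and call render_segment for each utterance during a meeting or chat session.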
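
Patent 10313845 and its pre-grant publication 20180352390 describe matching recognized speech against a dynamic grammar file stored locally and transcribing a buffered window of the utterances around each match. Below is a minimal, hypothetical sketch of that idea; the JSON grammar format, the class name, and the alert structure are assumptions made for illustration only.

    # Hypothetical sketch of the keyword-watch technique in patent 10313845.
    # The grammar file format and all names here are assumptions.
    import collections
    import json

    class KeywordWatcher:
        def __init__(self, grammar_path: str, buffer_seconds: float = 10.0):
            with open(grammar_path, encoding="utf-8") as f:
                # Assumed format: {"words": ["swetha", "deadline", "alarm"]}
                self.grammar = {w.lower() for w in json.load(f)["words"]}
            self.buffer = collections.deque()     # rolling (timestamp, word) pairs
            self.buffer_seconds = buffer_seconds

        def add_word(self, word: str, timestamp: float):
            # Feed one recognized word; return an alert dict on a grammar match.
            self.buffer.append((timestamp, word))
            while self.buffer and timestamp - self.buffer[0][0] > self.buffer_seconds:
                self.buffer.popleft()             # drop words outside the window
            if word.lower() in self.grammar:
                return {
                    "alert": f"Detected '{word}'",
                    # Live transcription of the buffered window around the match.
                    "transcription": " ".join(w for _, w in self.buffer),
                }
            return None

A production system would buffer raw audio rather than already-recognized words and run the recognizer over the captured window, but the rolling deque keyed by timestamps captures the buffered-window idea described in the abstract.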