Patents by Inventor Abhishek Abhishek
Abhishek Abhishek has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230386477
Abstract: A system and a method are disclosed for identifying a subjectively interesting moment in a transcript. In an embodiment, a device receives a transcription of a conversation, and identifies a participant of the conversation. The device accesses a machine learning model corresponding to the participant, and applies, as input to the machine learning model, the transcription. The device receives as output from the machine learning model a portion of the transcription having relevance to the participant, and generates for display, to the participant, information pertaining to the portion.
Type: Application
Filed: August 12, 2023
Publication date: November 30, 2023
Inventors: Krishnamohan Reddy Nareddy, Abhishek Abhishek, Rohit Ganpat Mane, Rajiv Garg
-
Publication number: 20230386465
Abstract: Described herein is a system for automatically detecting and assigning action items in a real-time conversation and determining whether such action items have been completed. The system detects, during a meeting, a plurality of action items and an utterance that corresponds to a completed action item. Responsive to detecting the utterance, the system generates a similarity score with respect to a first action item of the plurality of action items. The system compares the similarity score to a first threshold. Responsive to determining that the similarity score does not exceed the first threshold, the system generates a second similarity score with respect to a second action item of the plurality of action items. The system compares the second similarity score to a second threshold, which exceeds the first threshold. Responsive to determining that the second similarity score exceeds the second threshold, the system marks the second action item as completed.
Type: Application
Filed: August 12, 2023
Publication date: November 30, 2023
Inventors: Rohit Ganpat Mane, Abhishek Abhishek, Krishnamohan Reddy Nareddy, Rajiv Garg
-
Patent number: 11783829
Abstract: Described herein is a system for automatically detecting and assigning action items in a real-time conversation and determining whether such action items have been completed. The system detects, during a meeting, a plurality of action items and an utterance that corresponds to a completed action item. Responsive to detecting the utterance, the system generates a similarity score with respect to a first action item of the plurality of action items. The system compares the similarity score to a first threshold. Responsive to determining that the similarity score does not exceed the first threshold, the system generates a second similarity score with respect to a second action item of the plurality of action items. The system compares the second similarity score to a second threshold, which exceeds the first threshold. Responsive to determining that the second similarity score exceeds the second threshold, the system marks the second action item as completed.
Type: Grant
Filed: April 29, 2021
Date of Patent: October 10, 2023
Assignee: Outreach Corporation
Inventors: Rohit Ganpat Mane, Abhishek Abhishek, Krishnamohan Reddy Nareddy, Rajiv Garg
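The cascaded two-threshold comparison described in this abstract can be sketched as follows. This is an illustrative sketch only: the similarity metric (`difflib` string ratio), the threshold values, and the function names are hypothetical stand-ins, not taken from the patent, which would use its own scoring model.

```python
# Illustrative sketch of the cascaded two-threshold comparison described in
# the abstract; similarity metric and threshold values are hypothetical.
from difflib import SequenceMatcher
from typing import List, Optional

FIRST_THRESHOLD = 0.6   # hypothetical lower threshold
SECOND_THRESHOLD = 0.8  # per the abstract, the second threshold exceeds the first

def similarity(a: str, b: str) -> float:
    """Stand-in similarity score; a production system might use embeddings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def mark_completed(utterance: str, action_items: List[str]) -> Optional[str]:
    """Return the action item to mark complete, if any."""
    first, *rest = action_items
    # First comparison: lower threshold against the first action item.
    if similarity(utterance, first) > FIRST_THRESHOLD:
        return first
    # Fallback: stricter second threshold against the remaining action items.
    for item in rest:
        if similarity(utterance, item) > SECOND_THRESHOLD:
            return item
    return None
```

The stricter fallback threshold reduces false completions when the utterance was not a close match to the first action item.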
-
Patent number: 11763823
Abstract: A system and a method are disclosed for identifying a subjectively interesting moment in a transcript. In an embodiment, a device receives a transcription of a conversation, and identifies a participant of the conversation. The device accesses a machine learning model corresponding to the participant, and applies, as input to the machine learning model, the transcription. The device receives as output from the machine learning model a portion of the transcription having relevance to the participant, and generates for display, to the participant, information pertaining to the portion.
Type: Grant
Filed: February 18, 2021
Date of Patent: September 19, 2023
Assignee: Outreach Corporation
Inventors: Krishnamohan Reddy Nareddy, Abhishek Abhishek, Rohit Ganpat Mane, Rajiv Garg
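The per-participant relevance flow this abstract describes can be sketched as below. The keyword lookup is a hypothetical stand-in for the per-participant machine learning model named in the abstract, and all names and profiles here are invented for illustration.

```python
# Illustrative sketch: a keyword table stands in for the per-participant
# machine learning model described in the abstract. Profiles are hypothetical.
from typing import List

PARTICIPANT_MODELS = {
    "alice": {"budget", "pricing"},
    "bob": {"timeline", "launch"},
}

def relevant_portions(transcript: List[str], participant: str) -> List[str]:
    """Return the transcript lines the participant's model flags as relevant."""
    keywords = PARTICIPANT_MODELS.get(participant, set())
    return [line for line in transcript
            if any(k in line.lower() for k in keywords)]
```

Different participants receive different portions of the same transcript, which is the "subjectively interesting" aspect of the claimed method.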
-
Publication number: 20220373346
Abstract: According to examples, a system for providing personalized routes may include a processor and a memory storing instructions. The processor, when executing the instructions, may cause the system to receive a user request for a personalized route wherein the user request can include at least a route origin and destination. The processor employs the user request to identify the user and retrieves the user's preferences from different recommender systems. The recommendations output by the recommender systems are ranked based on context data associated with the user request. The geographic locations associated with a predetermined number of top-ranked recommendations are obtained and a personalized route that passes through a maximum number of the geographic locations is built for a display to the user.
Type: Application
Filed: May 5, 2022
Publication date: November 24, 2022
Applicant: Meta Platforms Technologies, LLC
Inventors: Abhishek ABHISHEK, Andrew James Forchione
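The rank-then-route pipeline in this abstract can be sketched as follows. The weighted tag scoring and the greedy nearest-first ordering are hypothetical simplifications of the context-based ranking and route construction the application describes.

```python
# Illustrative sketch: context-weighted ranking of recommendations, then a
# greedy route through the top-k locations. Scoring and ordering rules are
# hypothetical simplifications, not the patented method.
import math

def rank(recommendations, context_weights):
    """Rank recommendations by a weighted context score (hypothetical)."""
    return sorted(recommendations,
                  key=lambda r: sum(context_weights.get(t, 0) for t in r["tags"]),
                  reverse=True)

def build_route(origin, recommendations, context_weights, top_k=3):
    """Visit the top-k recommendation locations, nearest remaining stop first."""
    stops = [r["location"] for r in rank(recommendations, context_weights)[:top_k]]
    route, current = [origin], origin
    while stops:
        nxt = min(stops, key=lambda p: math.dist(current, p))
        stops.remove(nxt)
        route.append(nxt)
        current = nxt
    return route
```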
-
Patent number: 11244165
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Grant
Filed: June 30, 2020
Date of Patent: February 8, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Abhishek Abhishek, Khawar Zuberi, Jie Liu, Rouzbeh Aminpour, Zhengyou Zhang, Ali Dalloul, Dimitrios Lymberopoulos, Michel Goraczko, Tony Capone, Di Wang, Yi Lu, Yasser B. Asmi
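The query-supplementing step of the ambient perception component can be sketched as below. The query format and the gesture-resolved object list are hypothetical: the abstract does not specify how object information is attached to the query.

```python
# Illustrative sketch: attach gesture-derived object context to a user query,
# a simplified stand-in for the ambient perception component's supplementing
# step. The bracketed context format is hypothetical.
from typing import List

def supplement_query(query: str, gestured_objects: List[str]) -> str:
    """Append object context so an otherwise ambiguous query ("what is
    this?") carries the objects the user gestured toward."""
    if not gestured_objects:
        return query
    return f"{query} [context: {', '.join(gestured_objects)}]"
```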
-
Publication number: 20210343282
Abstract: Described herein is a system for automatically detecting and assigning action items in a real-time conversation and determining whether such action items have been completed. The system detects, during a meeting, a plurality of action items and an utterance that corresponds to a completed action item. Responsive to detecting the utterance, the system generates a similarity score with respect to a first action item of the plurality of action items. The system compares the similarity score to a first threshold. Responsive to determining that the similarity score does not exceed the first threshold, the system generates a second similarity score with respect to a second action item of the plurality of action items. The system compares the second similarity score to a second threshold, which exceeds the first threshold. Responsive to determining that the second similarity score exceeds the second threshold, the system marks the second action item as completed.
Type: Application
Filed: April 29, 2021
Publication date: November 4, 2021
Inventors: Rohit Ganpat Mane, Abhishek Abhishek, Krishnamohan Reddy Nareddy, Rajiv Garg
-
Publication number: 20210287683
Abstract: A system and a method are disclosed for identifying a subjectively interesting moment in a transcript. In an embodiment, a device receives a transcription of a conversation, and identifies a participant of the conversation. The device accesses a machine learning model corresponding to the participant, and applies, as input to the machine learning model, the transcription. The device receives as output from the machine learning model a portion of the transcription having relevance to the participant, and generates for display, to the participant, information pertaining to the portion.
Type: Application
Filed: February 18, 2021
Publication date: September 16, 2021
Inventors: Krishnamohan Reddy Nareddy, Abhishek Abhishek, Rohit Ganpat Mane, Rajiv Garg
-
Patent number: 10997420
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Grant
Filed: June 22, 2020
Date of Patent: May 4, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Abhishek Abhishek, Khawar Zuberi, Jie Liu, Rouzbeh Aminpour, Zhengyou Zhang, Ali Dalloul, Dimitrios Lymberopoulos, Michel Goraczko, Tony Capone, Di Wang, Yi Lu, Yasser B. Asmi
-
Publication number: 20210042689
Abstract: The discussion relates to inventory control. In one example, a set of ID sensors can be employed in an inventory control environment and subsets of the ID sensors can collectively sense tagged items in shared space. Data from the subset of ID sensors can indicate when a user has taken possession of an individual tagged item in the shared space.
Type: Application
Filed: October 23, 2020
Publication date: February 11, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Khawar ZUBERI, Abhishek ABHISHEK, Rouzbeh AMINPOUR, Yasser B. ASMI, Zhengyou ZHANG
-
Patent number: 10902376
Abstract: The discussion relates to inventory control. In one example, a set of ID sensors can be employed in an inventory control environment and subsets of the ID sensors can collectively sense tagged items in shared space. Data from the subset of ID sensors can indicate when a user has taken possession of an individual tagged item in the shared space.
Type: Grant
Filed: November 6, 2017
Date of Patent: January 26, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Khawar Zuberi, Abhishek Abhishek, Rouzbeh Aminpour, Yasser B. Asmi, Zhengyou Zhang
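The possession inference this abstract describes can be sketched as follows. The sensor-reading shape and the matching rule (an item disappearing from a sensor at a user's last known location) are hypothetical simplifications of how ID-sensor data would indicate a taken item.

```python
# Illustrative sketch: infer that a user took a tagged item when it stops
# being sensed at a location where that user was present. Data shapes and
# the inference rule are hypothetical, not the patented method.

def detect_possession(before, after, user_locations):
    """Map each item sensed before but not after to the user whose last
    known location matches the sensor that lost the item."""
    taken = {}
    for sensor, items in before.items():
        for item in items - after.get(sensor, set()):
            for user, loc in user_locations.items():
                if loc == sensor:
                    taken[item] = user
    return taken
```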
-
Publication number: 20200334462
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Application
Filed: June 30, 2020
Publication date: October 22, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Abhishek ABHISHEK, Khawar ZUBERI, Jie LIU, Rouzbeh AMINPOUR, Zhengyou ZHANG, Ali DALLOUL, Dimitrios LYMBEROPOULOS, Michel GORACZKO, Tony CAPONE, Di WANG, Yi LU, Yasser B. ASMI
-
Publication number: 20200320301
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Application
Filed: June 22, 2020
Publication date: October 8, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Abhishek ABHISHEK, Khawar ZUBERI, Jie LIU, Rouzbeh AMINPOUR, Zhengyou ZHANG, Ali DALLOUL, Dimitrios LYMBEROPOULOS, Michel GORACZKO, Tony CAPONE, Di WANG, Yi LU, Yasser B. ASMI
-
Patent number: 10748002
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Grant
Filed: April 27, 2018
Date of Patent: August 18, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Abhishek Abhishek, Khawar Zuberi, Jie Liu, Rouzbeh Aminpour, Zhengyou Zhang, Ali Dalloul, Dimitrios Lymberopoulos, Michel Goraczko, Tony Capone, Di Wang, Yi Lu, Yasser B. Asmi
-
Patent number: 10748001
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Grant
Filed: April 27, 2018
Date of Patent: August 18, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Abhishek Abhishek, Khawar Zuberi, Jie Liu, Rouzbeh Aminpour, Zhengyou Zhang, Ali Dalloul, Dimitrios Lymberopoulos, Michel Goraczko, Tony Capone, Di Wang, Yi Lu, Yasser B. Asmi
-
Publication number: 20190332864
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Application
Filed: April 27, 2018
Publication date: October 31, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Abhishek ABHISHEK, Khawar ZUBERI, Jie LIU, Rouzbeh AMINPOUR, Zhengyou ZHANG, Ali DALLOUL, Dimitrios LYMBEROPOULOS, Michel GORACZKO, Tony CAPONE, Di WANG, Yi LU, Yasser B. ASMI
-
Publication number: 20190332863
Abstract: The discussion relates to context-aware environments. One example can include inwardly-facing cameras positioned around a periphery of an environment that defines a volume. The example can also include sensors positioned relative to the volume and configured to communicate with a user device in the volume. The example can also include an ambient perception component configured to track user locations in the volume and to detect user gestures relative to objects in the volume, and responsive to receiving a query from the user's device, to supplement the query with information derived from the objects.
Type: Application
Filed: April 27, 2018
Publication date: October 31, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Abhishek ABHISHEK, Khawar ZUBERI, Jie LIU, Rouzbeh AMINPOUR, Zhengyou ZHANG, Ali DALLOUL, Dimitrios LYMBEROPOULOS, Michel GORACZKO, Tony CAPONE, Di WANG, Yi LU, Yasser B. ASMI
-
Publication number: 20190244161
Abstract: The discussion relates to inventory control. One example can analyze data from sensors to identify items and users in an inventory control environment. The example can detect co-location of an individual user and an individual item at a first location in the inventory control environment at a first time and at a second location in the inventory control environment at a second time.
Type: Application
Filed: February 2, 2018
Publication date: August 8, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Abhishek ABHISHEK, Rouzbeh AMINPOUR, Yasser B. ASMI, Zhengyou ZHANG, Ali DALLOUL, Jie LIU, Di WANG, Dimitrios LYMBEROPOULOS, Michel GORACZKO, Yi LU, William THOMAS
-
Publication number: 20190138975
Abstract: The discussion relates to inventory control. In one example, a set of ID sensors can be employed in an inventory control environment and subsets of the ID sensors can collectively sense tagged items in shared space. Data from the subset of ID sensors can indicate when a user has taken possession of an individual tagged item in the shared space.
Type: Application
Filed: November 6, 2017
Publication date: May 9, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Khawar ZUBERI, Abhishek Abhishek, Rouzbeh Aminpour, Yasser B. Asmi, Zhengyou Zhang
-
Patent number: 10104415
Abstract: A user device within a communication architecture, the user device comprising: an image capture device configured to determine image data for the creation of a video channel defining the shared scene; an intrinsic/extrinsic data determiner configured to determine intrinsic/extrinsic capture device data associated with the image capture device; and a video encoder configured to encode the image data and intrinsic/extrinsic capture device data within the video channel.
Type: Grant
Filed: January 21, 2015
Date of Patent: October 16, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Ming-Chieh Lee, Mei-Hsuan Lu, Robert Aichner, Ryan S. Menezes, Abhishek Abhishek, Bofan Hsu, Ermin Kozica