Patents by Inventor Prasad Narasimha

Prasad Narasimha has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11619927
    Abstract: Efficient and effective workspace condition analysis systems and methods are presented. In one embodiment, a method comprises: accessing information associated with an activity space, including information on a newly discovered previously unmodeled entity; analyzing the activity information, including activity information associated with the previously unmodeled entity; forwarding feedback on the results of the analysis, including analysis results for the updated modeled information; and utilizing the feedback in a coordinated path plan check process. In one exemplary implementation the coordinated path plan check process comprises: creating a solid/CAD model including updated modeled information; simulating an activity including the updated modeled information; generating a coordinated path plan for entities in the activity space; and testing the coordinated path plan. The coordinated path plan check process can be a success.
    Type: Grant
    Filed: November 5, 2018
    Date of Patent: April 4, 2023
    Assignee: Drishti Technologies, Inc.
    Inventors: Prasad Narasimha Akella, Krishnendu Chaudhury, Ananth Uggirala
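    A minimal Python sketch of the coordinated path plan check described in the abstract above, not the patented implementation: the toy planner, the waypoint "collision" test, and all names are illustrative assumptions.
      # Illustrative sketch only: build an updated workspace model, generate a
      # coordinated path plan for its entities, and test the plan. The planner
      # and the collision test are toy stand-ins; simulation is omitted.
      from dataclasses import dataclass, field

      @dataclass
      class WorkspaceModel:
          entities: list = field(default_factory=list)  # modeled entities in the activity space

      def update_model(model, new_entity):
          """Fold a newly discovered, previously unmodeled entity into the model."""
          model.entities.append(new_entity)
          return model

      def generate_path_plan(model):
          """Toy planner: give each entity its own 'lane' of timed waypoints."""
          return {e: [(i, t) for t in range(3)] for i, e in enumerate(model.entities)}

      def test_path_plan(plan):
          """Toy check: fail if two entities occupy the same waypoint at the same step."""
          steps = zip(*plan.values()) if plan else []
          return all(len(set(step)) == len(step) for step in steps)

      def coordinated_path_plan_check(model, new_entity):
          plan = generate_path_plan(update_model(model, new_entity))
          return test_path_plan(plan)

      print(coordinated_path_plan_check(WorkspaceModel(["robot_arm"]), "operator"))  # True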
  • Patent number: 11348033
    Abstract: A system comprises one or more observation stations. Each observation station of the one or more observation stations comprises a corresponding set of one or more sensors. Additionally, the system comprises one or more physical machines that implement a computation engine configured to receive first observation data from the one or more observation stations. The computation engine may use the first observation data to train a machine learning system. The computation engine may subsequently use the trained machine learning system to provide feedback regarding an additional instance of the observation subject. The computation engine outputs the feedback.
    Type: Grant
    Filed: July 21, 2017
    Date of Patent: May 31, 2022
    Assignee: SRI International
    Inventors: Prasad Narasimha Akella, Bhaskar Ramamurthy, Manish Kothari, John Peter Marcotullio
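    A minimal Python sketch, assuming a nearest-centroid model as a stand-in for the unspecified machine learning system: the computation engine trains on first observation data from the stations and then returns feedback on an additional instance. Class and field names are illustrative, not from the patent.
      # Illustrative sketch only (not the patented system).
      from statistics import mean

      class ComputationEngine:
          def __init__(self):
              self.centroid = None

          def train(self, first_observation_data):
              """first_observation_data: list of feature vectors from the observation stations."""
              self.centroid = [mean(col) for col in zip(*first_observation_data)]

          def feedback(self, new_instance, tolerance=1.0):
              """Compare a new instance of the observation subject to the trained centroid."""
              deviation = sum(abs(a - b) for a, b in zip(new_instance, self.centroid))
              return "within expected range" if deviation <= tolerance else f"deviates by {deviation:.2f}"

      engine = ComputationEngine()
      engine.train([[1.0, 2.0], [1.2, 1.8], [0.9, 2.1]])  # data from the observation stations
      print(engine.feedback([1.1, 2.0]))                  # feedback on an additional instance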
  • Patent number: 11328307
    Abstract: A computer-based system is connected to a remote user device and a plurality of services. A data collection engine has an input adapted to receive a set of collection data from the set of services and, via the digital communications network, a query; the set of collection data comprises a set of brand data, having a set of fields, for comparison against a control set of data stored in a database.
    Type: Grant
    Filed: February 24, 2016
    Date of Patent: May 10, 2022
    Assignee: OpSec Online, Ltd.
    Inventors: Krishna Venkatraman, Paul Garrett Gaudini, Joshua Hopping, Marcos Perreau Guimaraes, Prasad Narasimha Akella
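    A minimal Python sketch of the comparison step suggested by the abstract above: collected brand data is checked field by field against a control set stored for the brand. The field names and matching rule are assumptions for illustration.
      # Illustrative sketch only: flag collected fields that disagree with the control set.
      CONTROL_SET = {"brand_name": "Acme", "logo_hash": "a1b2c3", "official_domain": "acme.example"}

      def compare_against_control(collected_record, control=CONTROL_SET):
          """Return the fields in a collected record that differ from the control data."""
          return {field: value
                  for field, value in collected_record.items()
                  if field in control and value != control[field]}

      suspect = {"brand_name": "Acme", "logo_hash": "ffff00", "official_domain": "acme-deals.example"}
      print(compare_against_control(suspect))  # fields that disagree with the control set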
  • Patent number: 11327638
    Abstract: Currently there is no product that allows an attractive, graphical, home- or building-like virtual presence to be created on the Internet, one that would allow all the typical interactions or information dissemination that one would like to have with individuals or businesses, reflecting their actual presence in a geographical location. Such a product allows virtual neighborhoods to be created by geographical location, by interests, by the nature of businesses, or by many other such classifications. Enabling active interaction with businesses through the graphical presence will be useful both in increasing footfall for the businesses and in reducing their back-end operational expenses by automating the interactions with their customers.
    Type: Grant
    Filed: April 29, 2019
    Date of Patent: May 10, 2022
    Inventor: Prasad Narasimha Seshadri
  • Publication number: 20220046176
    Abstract: A method and system for automating camera movement and zooming in on the latest instructor writing by properly centering the latest writing in the camera view and zooming in to achieve good magnification as well as focus, enabling the writing to be seen clearly without requiring greater than 100% zoom in the browser used by the viewer on a laptop, desktop, or mobile device.
    Type: Application
    Filed: June 29, 2021
    Publication date: February 10, 2022
    Inventor: Prasad Narasimha Seshadri
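    A minimal Python sketch of one way the centering and zooming described above could work: compute a crop around the bounding box of the latest writing, padded by a margin, held to a target aspect ratio, and clamped to the frame. Coordinates, margin, and aspect ratio are assumptions.
      # Illustrative sketch only: a virtual-camera crop centered on the latest writing.
      def crop_for_latest_writing(box, frame_w, frame_h, margin=0.25, aspect=16 / 9):
          """box = (x, y, w, h) of the latest writing; returns an (x, y, w, h) crop."""
          x, y, w, h = box
          crop_w = w * (1 + 2 * margin)
          crop_h = max(h * (1 + 2 * margin), crop_w / aspect)
          crop_w = max(crop_w, crop_h * aspect)                  # keep the target aspect ratio
          cx, cy = x + w / 2, y + h / 2                          # center on the writing
          left = min(max(cx - crop_w / 2, 0), frame_w - crop_w)  # clamp to the frame
          top = min(max(cy - crop_h / 2, 0), frame_h - crop_h)
          return (left, top, crop_w, crop_h)

      print(crop_for_latest_writing((1200, 500, 300, 120), frame_w=1920, frame_h=1080))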
  • Patent number: 11175650
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for kitting products, including real-time verification of packing or unpacking by action and image recognition.
    Type: Grant
    Filed: November 5, 2018
    Date of Patent: November 16, 2021
    Assignee: Drishti Technologies, Inc.
    Inventors: Prasad Narasimha Akella, Krishnendu Chaudhury
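    A minimal Python sketch in the spirit of the analytics tool described above (and the similar entries below): recognize actions frame by frame and verify a packing (kitting) sequence against the expected steps. The lookup-table recognizer stands in for a learned model; all labels are made up.
      # Illustrative sketch only: per-frame action recognition plus a kitting check.
      FRAME_TO_ACTION = {0: "pick_part_A", 1: "pick_part_B", 2: "place_in_box", 3: "seal_box"}

      def recognize_actions(frames):
          """Map sensor frames (here just frame indices) to action labels."""
          return [FRAME_TO_ACTION.get(f, "unknown") for f in frames]

      def verify_kitting(frames, expected_sequence):
          """Check that the recognized actions match the expected kit steps, in order."""
          return recognize_actions(frames) == expected_sequence

      print(verify_kitting([0, 1, 2, 3],
                           ["pick_part_A", "pick_part_B", "place_in_box", "seal_box"]))  # True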
  • Patent number: 11054811
    Abstract: In various embodiments, a method includes receiving one or more sensor streams with an engine. The engine identifies one or more actions that are performed at first and second stations of a plurality of stations within the sensor stream(s). The received sensor stream(s) and identified one or more actions performed at the first and second stations are stored in a data structure. The identified one or more actions are mapped to the sensor stream(s). The engine characterizes each of the identified one or more actions performed at each of the first and second stations to produce determined characterizations thereof. Based on one or more of the determined characterizations, a recommendation is automatically produced, either dynamically or post facto, to move at least one of the identified one or more actions performed at one of the stations to another station to reduce cycle time.
    Type: Grant
    Filed: November 5, 2018
    Date of Patent: July 6, 2021
    Assignee: Drishti Technologies, Inc.
    Inventors: Prasad Narasimha Akella, Krishnendu Chaudhury, Sameer Gupta, Ananth Uggirala
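    A minimal Python sketch of the rebalancing recommendation described above: find the bottleneck station by total action duration and suggest moving its shortest action to the least-loaded station. Station names, durations, and the selection rule are illustrative assumptions.
      # Illustrative sketch only: a cycle-time rebalancing recommendation.
      def recommend_rebalance(station_actions):
          """station_actions: {station: [(action, duration_seconds), ...]}."""
          totals = {s: sum(d for _, d in acts) for s, acts in station_actions.items()}
          slowest = max(totals, key=totals.get)   # the bottleneck sets the cycle time
          fastest = min(totals, key=totals.get)
          action, duration = min(station_actions[slowest], key=lambda a: a[1])
          return f"Move '{action}' ({duration}s) from {slowest} to {fastest}"

      print(recommend_rebalance({
          "station_1": [("fit_bracket", 12), ("torque_bolts", 18)],
          "station_2": [("attach_cover", 9)],
      }))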
  • Patent number: 10890898
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing, restaurants and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for establishing traceability.
    Type: Grant
    Filed: November 5, 2018
    Date of Patent: January 12, 2021
    Assignee: Drishti Technologies, Inc.
    Inventors: Prasad Narasimha Akella, Ananya Honnedevasthana Ashok, Krishnendu Chaudhury, Sujay Venkata Krishna Narumanchi, Devashish Shankar, Ananth Uggirala
  • Publication number: 20190266514
    Abstract: A system comprises one or more observation stations. Each observation station of the one or more observation stations comprises a corresponding set of one or more sensors. Additionally, the system comprises one or more physical machines that implement a computation engine configured to receive first observation data from the one or more observation stations. The computation engine may use the first observation data to train a machine learning system. The computation engine may subsequently use the trained machine learning system to provide feedback regarding an additional instance of the observation subject. The computation engine outputs the feedback.
    Type: Application
    Filed: July 21, 2017
    Publication date: August 29, 2019
    Inventors: Prasad Narasimha Akella, Bhaskar Ramamurthy, Manish Kothari, John Peter Marcotullio
  • Publication number: 20190138973
    Abstract: Embodiments of the present invention provide a machine and continuous data set including process data, quality data, specific actor data, and ergonomic data (among others) to create more accurate job assignments that maximize efficiency, quality and worker safety. Using the data set, tasks may be assigned to actors based on objective statistical data such as skills, task requirements, ergonomics and time availability. Assigning tasks in this way can provide unique value for manufacturers who currently conduct similar analyses using only minimal observational data.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Ananth UGGIRALA, Krishnendu CHAUDHURY, Sameer GUPTA, Sujay Venkata Krishna NARUMANCHI
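    A minimal Python sketch of data-driven task assignment along the lines of the abstract above: score each actor on skill match, ergonomic load, and time availability, then assign the task to the highest-scoring actor. The fields and weights are assumptions, not values from the application.
      # Illustrative sketch only: weighted scoring for task assignment.
      def score(actor, task):
          skill = 1.0 if task["required_skill"] in actor["skills"] else 0.0
          ergonomic = 1.0 - min(actor["current_load"] + task["ergonomic_load"], 1.0)
          time_ok = 1.0 if actor["available_minutes"] >= task["minutes"] else 0.0
          return 0.5 * skill + 0.3 * ergonomic + 0.2 * time_ok

      def assign(task, actors):
          """Assign the task to the highest-scoring actor."""
          return max(actors, key=lambda a: score(a, task))["name"]

      actors = [
          {"name": "A", "skills": {"welding"}, "current_load": 0.7, "available_minutes": 30},
          {"name": "B", "skills": {"welding", "assembly"}, "current_load": 0.2, "available_minutes": 60},
      ]
      print(assign({"required_skill": "welding", "ergonomic_load": 0.3, "minutes": 45}, actors))  # B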
  • Publication number: 20190138932
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for process validation, anomaly detection and in-process quality assurance.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Ananya Honnedevasthana ASHOK, Krishnendu CHAUDHURY, Ashish GUPTA, Sujay Venkata Krishna NARUMANCHI, David Scott PRAGER, Devashish SHANKAR, Ananth UGGIRALA
  • Publication number: 20190138880
    Abstract: In one embodiment a method comprises: accessing information associated with a first actor, including sensed activity information associated with an activity space; analyzing the activity information, including analyzing activity of the first actor with respect to a plurality of other actors; and forwarding feedback on the results of the analysis, wherein the results include identification of a second actor as a replacement actor to replace the first actor, wherein the second actor is one of the plurality of other actors. The activity space can include an activity space associated with performance of a task. The analyzing can comprise: comparing information associated with activity of the first actor within the activity space with anticipated activity of the respective ones of the plurality of the actors within the activity space; and analyzing/comparing deviations between the activity of the first actor and the anticipated activity of the second actor.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventor: Prasad Narasimha AKELLA
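    A minimal Python sketch of the replacement-actor comparison described above: measure the deviation between the first actor's observed activity and each candidate's anticipated activity, and pick the closest candidate as the replacement. The activity vectors and distance measure are illustrative assumptions.
      # Illustrative sketch only: choose the replacement actor with the smallest deviation.
      def deviation(observed, anticipated):
          return sum(abs(a - b) for a, b in zip(observed, anticipated))

      def choose_replacement(first_actor_activity, candidates):
          """candidates: {actor_name: anticipated_activity_vector}."""
          return min(candidates, key=lambda name: deviation(first_actor_activity, candidates[name]))

      observed = [4.0, 2.5, 1.0]                       # e.g., per-task pace of the first actor
      candidates = {"actor_2": [4.2, 2.4, 1.1], "actor_3": [6.0, 1.0, 0.5]}
      print(choose_replacement(observed, candidates))  # -> actor_2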
  • Publication number: 20190138971
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for automatic creation of work charts.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Ananth UGGIRALA, Yash Raj CHHABRA, Zakaria Ibrahim ASSOUL, Krishnendu CHAUDHURY, Prasad Narasimha AKELLA
  • Publication number: 20190138623
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for automatic creation of birth certificates for each instance of a subject product or service. The birth certificate can string together snippets of the sensor streams along with indicators of cycles, processes, actions, sequences, objects, parameters and the like captured in the sensor streams.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Ananya Honnedevasthana ASHOK, Zakaria Ibrahim ASSOUL, Krishnendu CHAUDHURY, Sameer GUPTA, Sujay Venkata Krishna NARUMANCHI, David Scott PRAGER, Devashish SHANKAR, Ananth UGGIRALA, Yash Raj CHHABRA
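    A minimal Python sketch of a "birth certificate" as an ordered record that strings sensor-stream snippets together with action indicators and parameters, one plausible data structure for the abstract above. Field names are assumptions.
      # Illustrative sketch only: an ordered per-unit record of sensor-stream snippets.
      from dataclasses import dataclass, field

      @dataclass
      class BirthCertificate:
          product_id: str
          entries: list = field(default_factory=list)

          def add_snippet(self, stream_id, start_frame, end_frame, action, parameters=None):
              self.entries.append({
                  "stream": stream_id, "frames": (start_frame, end_frame),
                  "action": action, "parameters": parameters or {},
              })

      cert = BirthCertificate("unit-0001")
      cert.add_snippet("video_station_3", 120, 180, "torque_bolts", {"torque_nm": 12})
      cert.add_snippet("thermal_station_4", 10, 45, "solder_joint")
      print(len(cert.entries), cert.entries[0]["action"])  # 2 torque_bolts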
  • Publication number: 20190138905
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing, restaurants and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for establishing traceability.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Ananya Honnedevasthana ASHOK, Krishnendu CHAUDHURY, Sujay Venkata Krishna NARUMANCHI, Devashish SHANKAR, Ananth UGGIRALA
  • Publication number: 20190139441
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for contextual training using the one or more sensor streams and machine learning based action recognition.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Zakaria Ibrahim ASSOUL, Krishnendu CHAUDHURY, Yash Raj CHHABRA, Aditya DALMIA, Sujay Venkata Krishna NARUMANCHI, Chirag RAVINDRA, Ananth UGGIRALA
  • Publication number: 20190138674
    Abstract: Efficient and effective workspace condition analysis systems and methods are presented. In one embodiment, a method comprises: accessing information associated with an activity space, including information on a newly discovered previously unmodeled entity; analyzing the activity information, including activity information associated with the previously unmodeled entity; forwarding feedback on the results of the analysis, including analysis results for the updated modeled information; and utilizing the feedback in a coordinated path plan check process. In one exemplary implementation the coordinated path plan check process comprises: creating a solid/CAD model including updated modeled information; simulating an activity including the updated modeled information; generating a coordinated path plan for entities in the activity space; and testing the coordinated path plan. The coordinated path plan check process can be a success.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Krishnendu CHAUDHURY, Ananth UGGIRALA
  • Publication number: 20190138676
    Abstract: The systems and methods provide an action recognition and analytics tool for use in manufacturing, health care services, shipping, retailing and other similar contexts. Machine learning action recognition can be utilized to determine cycles, processes, actions, sequences, objects and/or the like in one or more sensor streams. The sensor streams can include, but are not limited to, one or more video sensor frames, thermal sensor frames, infrared sensor frames, and/or three-dimensional depth frames. The analytics tool can provide for analyzing ergonomic data from the one or more sensor streams.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Ananya Honnedevasthana ASHOK, Zakaria Ibrahim ASSOUL, Krishnendu CHAUDHURY, Sameer GUPTA, Ananth UGGIRALA
  • Publication number: 20190138967
    Abstract: Workspace coordination systems and methods are presented. A method can comprise: accessing in real time respective information associated with a first actor and a second actor, including sensed activity information; analyzing the information, including analyzing activity of the first actor with respect to a second actor; and forwarding respective feedback based on the results of the analysis. The feedback can include an individual objective specific to either the first actor or the second actor. The feedback can include a collective objective with respect to the first actor or the second actor. The analyzing can include automated artificial intelligence analysis. Sensed activity information can be associated with a grid within the activity space. It is appreciated there can be various combinations of actors (e.g., human and device, device and device, human and human, etc.). The feedback can be a configuration layout suggestion. The feedback can be a suggested assignment of a type of actor to an activity.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventor: Prasad Narasimha AKELLA
  • Publication number: 20190137979
    Abstract: In various embodiments, a method includes receiving one or more sensor streams with an engine. The engine identifies one or more actions that are performed at first and second stations of a plurality of stations within the sensor stream(s). The received sensor stream(s) and identified one or more actions performed at the first and second stations are stored in a data structure. The identified one or more actions are mapped to the sensor stream(s). The engine characterizes each of the identified one or more actions performed at each of the first and second stations to produce determined characterizations thereof. Based on one or more of the determined characterizations, a recommendation is automatically produced, either dynamically or post facto, to move at least one of the identified one or more actions performed at one of the stations to another station to reduce cycle time.
    Type: Application
    Filed: November 5, 2018
    Publication date: May 9, 2019
    Inventors: Prasad Narasimha AKELLA, Krishnendu CHAUDHURY, Sameer GUPTA, Ananth UGGIRALA