Patents by Inventor Mithilesh Kumar Singh
Mithilesh Kumar Singh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11809512
Abstract: Provided are systems and methods for converting user interface events that occur in a software application developed via a WebGUI framework into activity descriptions and into a bot software program. In one example, a method may include recording events transmitted between a user interface of a web application open within a web browser on a client device and a back-end of the web application on a server, identifying codes associated with user interface elements which are assigned to the recorded events, converting the identified codes assigned to the recorded events into human-readable descriptions of the recorded events based on a predefined mapping between the codes and the human-readable descriptions, and displaying the human-readable descriptions of the recorded events based on an identified order among the recorded events.
Type: Grant
Filed: December 14, 2021
Date of Patent: November 7, 2023
Assignee: SAP SE
Inventors: Satyadeep Dey, Vinay Kumar, Sharmika Parmar, Sudha Karanam Narasimha Murthy, Chandrakanth S, Mithilesh Kumar Singh, Suvajit Dutta
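The code-to-description step this abstract outlines can be illustrated with a minimal sketch. The event codes, description templates, and event tuple layout below are assumptions for illustration, not taken from the patent:

```python
# Hypothetical predefined mapping between UI event codes and
# human-readable activity description templates.
CODE_DESCRIPTIONS = {
    "BTN_PRESS": "Clicked button '{target}'",
    "TXT_SET": "Entered text into field '{target}'",
    "TAB_SELECT": "Switched to tab '{target}'",
}

def describe_events(recorded_events):
    """Convert (order, code, target) tuples into ordered descriptions."""
    descriptions = []
    # Preserve the identified order among the recorded events.
    for _, code, target in sorted(recorded_events, key=lambda e: e[0]):
        template = CODE_DESCRIPTIONS.get(code, "Unknown action ({})".format(code))
        descriptions.append(template.format(target=target))
    return descriptions

events = [(2, "TXT_SET", "Amount"), (1, "BTN_PRESS", "Create Order")]
print(describe_events(events))
```

In this sketch the mapping is a plain dictionary lookup; the patent's predefined mapping could equally live in a configuration file or database.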
-
Publication number: 20230185867
Abstract: Provided are systems and methods for converting user interface events that occur in a software application developed via a WebGUI framework into activity descriptions and into a bot software program. In one example, a method may include recording events transmitted between a user interface of a web application open within a web browser on a client device and a back-end of the web application on a server, identifying codes associated with user interface elements which are assigned to the recorded events, converting the identified codes assigned to the recorded events into human-readable descriptions of the recorded events based on a predefined mapping between the codes and the human-readable descriptions, and displaying the human-readable descriptions of the recorded events based on an identified order among the recorded events.
Type: Application
Filed: December 14, 2021
Publication date: June 15, 2023
Inventors: Satyadeep Dey, Vinay Kumar, Sharmika Parmar, Sudha Karanam Narasimha Murthy, Chandrakanth S., Mithilesh Kumar Singh, Suvajit Dutta
-
Publication number: 20230188591
Abstract: Provided are systems and methods for recording user interface events that occur in a software application developed via a WebGUI framework. In one example, a method may include establishing a session between a front-end of a web application open within a web browser on a client device and a back-end of the web application hosted on a server, activating a recorder via a web extension of the web browser of the client device based on attributes of the established session, capturing user interface events transmitted between the front-end of the application within the web browser on the client device and the back-end of the application hosted on the server via the activated recorder, and recording the captured user interface events in a file.
Type: Application
Filed: December 14, 2021
Publication date: June 15, 2023
Inventors: Satyadeep Dey, Vinay Kumar, Sharmika Parmar, Sudha Karanam Narasimha Murthy, Chandrakanth S, Mithilesh Kumar Singh, Suvajit Dutta, Arno Esser
-
Publication number: 20230185586
Abstract: Provided are systems and methods for batching instructions of a bot during execution/runtime of the bot. The bot may be a software program that is designed to perform user interface interactions (e.g., button clicks, opening/closing pages, text entry, etc.) in place of a human. In one example, a method may include receiving a request to execute a bot program configured to perform a sequence of actions on a user interface of a software application, identifying a plurality of actions of the bot program that can be batched, assembling a plurality of instructions for performing the plurality of actions of the bot program into a batched payload, and transmitting an automation request with the batched payload to a back-end of the application on a server.
Type: Application
Filed: December 14, 2021
Publication date: June 15, 2023
Inventors: Satyadeep Dey, Vinay Kumar, Sharmika Parmar, Sudha Karanam Narasimha Murthy, Gagan K., Chandrakanth S., Mithilesh Kumar Singh, Suvajit Dutta
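The batching idea can be sketched in a few lines: instructions for batchable actions are assembled into one payload instead of one request per action. The payload schema and field names below are hypothetical, not from the patent:

```python
import json

def build_batched_payload(actions):
    """Assemble instructions for batchable actions into one payload."""
    # Only actions marked batchable are grouped; in a real system a
    # navigation step or server round-trip might force a flush instead.
    instructions = [
        {"op": a["op"], "target": a["target"], "value": a.get("value")}
        for a in actions if a.get("batchable", True)
    ]
    return json.dumps({"automation_request": {"batch": instructions}})

actions = [
    {"op": "set_text", "target": "Material", "value": "M-01", "batchable": True},
    {"op": "click", "target": "Save", "batchable": True},
]
payload = build_batched_payload(actions)
print(payload)
```

One motivation for batching of this kind is reducing client/server round trips: a single automation request replaces one request per UI action.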
-
Publication number: 20230185869
Abstract: Provided are systems and methods for automatically detecting a change in screen content and generating a hint message in response. As another example, the system may auto-capture the detected change in the screen content instead of or in addition to the hint message. In one example, a method may include capturing user interface metadata of content being displayed by a client-side of the software application, activating a recorder that is configured to record events transmitted between the client-side and a server-side of the software application, receiving updated user interface metadata based on a user interaction on the client-side of the software application, determining that user interface content has changed based on a comparison of the captured user interface metadata to the updated user interface metadata, and displaying a hint message via the user interface of the client-side of the software application.
Type: Application
Filed: December 14, 2021
Publication date: June 15, 2023
Inventors: Chandrakanth S, Sudha Karanam Narasimha Murthy, Sharmika Parmar, Suvajit Dutta, Vinay Kumar, Satyadeep Kumar Dey, Mithilesh Kumar Singh
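The change-detection step reduces to comparing two metadata snapshots. A minimal sketch, with entirely hypothetical metadata fields:

```python
def screen_changed(captured, updated):
    """Return True when updated UI metadata differs from the capture."""
    # The fields compared here are assumptions; real UI metadata might
    # include control trees, titles, or screen identifiers.
    keys = ("screen_id", "title", "visible_controls")
    return any(captured.get(k) != updated.get(k) for k in keys)

before = {"screen_id": "SC01", "title": "Orders", "visible_controls": 12}
after = {"screen_id": "SC02", "title": "Order Detail", "visible_controls": 8}

if screen_changed(before, after):
    print("Hint: screen content changed - capture this step?")
```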
-
Publication number: 20220382424
Abstract: Aspects of the present disclosure provide techniques for application navigation recommendations using machine learning. Embodiments include determining one or more pages accessed by a user within an application. Embodiments include providing one or more inputs to a machine learning model based on the one or more pages accessed by the user. Embodiments include receiving, from the machine learning model based on the one or more inputs, one or more predicted pages. Embodiments include displaying, in a user interface, one or more elements recommending the one or more predicted pages to the user. Embodiments include receiving a selection of a given element of the one or more elements. Embodiments include navigating within the user interface, based on the selection, to a given page of the one or more predicted pages that corresponds to the given element.
Type: Application
Filed: May 18, 2022
Publication date: December 1, 2022
Inventors: Deepankar MOHAPATRA, Ronnie Douglas DOUTHIT, Mithilesh Kumar SINGH, Manish Omprakash BHATIA, Jessica Colleen DANBY, Somin HEO
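The recommend-then-navigate flow can be illustrated with a toy stand-in for the model: page-transition counts from historical sessions instead of a trained machine learning model. This is purely illustrative of the flow, not the patent's approach to prediction:

```python
from collections import Counter, defaultdict

def train_transitions(sessions):
    """Count page-to-page transitions across historical sessions."""
    model = defaultdict(Counter)
    for pages in sessions:
        for cur, nxt in zip(pages, pages[1:]):
            model[cur][nxt] += 1
    return model

def predict_pages(model, current_page, k=2):
    """Return up to k most likely next pages for the current page."""
    return [page for page, _ in model[current_page].most_common(k)]

sessions = [["home", "invoices", "reports"],
            ["home", "invoices", "payments"],
            ["home", "invoices", "reports"]]
model = train_transitions(sessions)
print(predict_pages(model, "invoices"))
```

In the abstract's terms, the predicted pages would then be rendered as selectable recommendation elements, and choosing one navigates to the corresponding page.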
-
Publication number: 20220237404
Abstract: Disclosed herein are system, method, and computer program product embodiments for surface automation in black box environments. An embodiment operates by determining scenarios of an application for automation; detecting the scenario during an execution of an application; capturing and storing one or more user interface screenshots of the scenario; identifying and storing user interface information from the user interface screenshot; implementing a sequential set of instructions comprising at least one textual element detection technique and at least one non-textual element detection technique; and executing the sequential set of instructions.
Type: Application
Filed: January 25, 2021
Publication date: July 28, 2022
Inventors: Mithilesh Kumar Singh, Anubhav Sadana, Deepak Pandian, Raghavendra D, Satyadeep Dey, Phillippe Long
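The sequential combination of a textual and a non-textual detection technique can be sketched as a chained fallback. The detectors below are stubs over pre-extracted element records; in a real black-box setting they would run OCR and template/shape matching over the screenshot itself:

```python
def detect_by_text(screenshot_elems, label):
    """Textual detection stub: match an OCR-recognized label."""
    return next((e for e in screenshot_elems if e.get("text") == label), None)

def detect_by_template(screenshot_elems, shape):
    """Non-textual detection stub: match a visual shape/template."""
    return next((e for e in screenshot_elems if e.get("shape") == shape), None)

def locate(screenshot_elems, label, shape):
    """Run the textual technique first, then the non-textual fallback."""
    return (detect_by_text(screenshot_elems, label)
            or detect_by_template(screenshot_elems, shape))

elems = [{"shape": "button", "bbox": (10, 10, 80, 30)},
         {"text": "Submit", "bbox": (100, 10, 170, 30)}]
print(locate(elems, "Submit", "button"))
```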
-
Publication number: 20220083907
Abstract: A data annotation server accesses a request from a machine learning server for annotated images of a user interface containing a specified user interface element. The data annotation server programmatically determines whether user interfaces generated by an application server include the specified user interface element. If so, an image of the user interface is stored and a location or bounding box of the user interface element is determined. The stored image of the user interface is annotated with the determined location of the user interface element. The image and the annotation are provided to the machine learning server, which uses the images and annotations to train a machine learning model.
Type: Application
Filed: September 17, 2020
Publication date: March 17, 2022
Inventor: Mithilesh Kumar Singh
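The core check-and-annotate step can be sketched over a UI element tree. The record schema (`type`, `bbox`, the annotation dictionary) is an assumption for illustration:

```python
def annotate(ui_tree, image_path, wanted_type):
    """Return an annotation record if the UI contains the wanted element type."""
    boxes = [e["bbox"] for e in ui_tree if e["type"] == wanted_type]
    if not boxes:
        return None  # the UI does not contain the specified element
    # Pair the stored screenshot with the determined bounding boxes.
    return {"image": image_path, "label": wanted_type, "boxes": boxes}

ui_tree = [{"type": "checkbox", "bbox": [4, 8, 20, 24]},
           {"type": "button", "bbox": [40, 8, 120, 36]}]
print(annotate(ui_tree, "screen_001.png", "checkbox"))
```

Because the annotation server knows the UI structure programmatically, bounding boxes come for free, avoiding manual labeling of training images.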
-
Publication number: 20220044111
Abstract: In an example embodiment, actionable flows are found from customer tickets by autonomously reading and understanding customer queries using neural networks. Specifically, in an example embodiment, a deep neural network is trained to be utilized at two separate stages in a flow generation process. In one stage, the neural network is used to identify a list of repetitive queries from a repository of customer tickets. In another stage, the neural network is used to identify actionable flows from query steps obtained from the list of repetitive queries.
Type: Application
Filed: August 7, 2020
Publication date: February 10, 2022
Inventors: Mithilesh Kumar Singh, Mohammad Saad Rashid, Warren Mark Fernandes
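A toy stand-in for the first stage (finding repetitive queries): grouping tickets by token-set similarity instead of a trained neural network. The similarity measure and threshold are illustrative only:

```python
def similar(a, b, threshold=0.5):
    """Jaccard similarity over lowercase token sets, as a crude proxy."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) >= threshold

def repetitive_queries(tickets, min_count=2):
    """Greedily group similar tickets; return one exemplar per large group."""
    groups = []
    for ticket in tickets:
        for group in groups:
            if similar(ticket, group[0]):
                group.append(ticket)
                break
        else:
            groups.append([ticket])
    return [group[0] for group in groups if len(group) >= min_count]

tickets = ["how do I reset my password",
           "how to reset my password",
           "invoice export fails"]
print(repetitive_queries(tickets))
```

The abstract's second stage, extracting actionable flows from the query steps, would consume the exemplars such a grouping produces.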
-
Patent number: 10970097
Abstract: A computer-implemented method can receive a request from a robotic process automation engine to identify a target user interface control element in a webpage represented by a current master data frame. The current master data frame comprises a current document object model (DOM). The method can determine that a target user interface control element identifier associated with the target user interface control element is absent in the current DOM. The method can retrieve an archived version of the target user interface control element from an archived master data frame of the webpage. The method can find an equivalent user interface control element within the current master data frame based at least on the archived version of the target user interface control element, and output an equivalent user interface control element identifier associated with the equivalent user interface control element.
Type: Grant
Filed: August 29, 2019
Date of Patent: April 6, 2021
Assignee: SAP SE
Inventors: Mohammad Saad Rashid, Warren Mark Fernandes, Mithilesh Kumar Singh, Sonam Saxena, Sai Phani Sharath Chandra Danthalapelli
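The fallback idea can be sketched as attribute scoring: when a control's identifier is absent from the current DOM, score current elements against the archived version's attributes and return the closest match. The attributes and score cutoff here are assumptions, not the patent's matching criteria:

```python
def find_equivalent(current_dom, archived_elem):
    """Find the current element most similar to the archived control."""
    def score(elem):
        # Count matching attributes against the archived version.
        keys = ("tag", "label", "css_class")
        return sum(elem.get(k) == archived_elem.get(k) for k in keys)
    best = max(current_dom, key=score)
    # Require at least two matching attributes to accept the equivalent.
    return best["id"] if score(best) >= 2 else None

archived = {"id": "btn_save_v1", "tag": "button", "label": "Save",
            "css_class": "primary"}
current_dom = [
    {"id": "btn_save_v2", "tag": "button", "label": "Save", "css_class": "primary"},
    {"id": "lnk_help", "tag": "a", "label": "Help", "css_class": "muted"},
]
print(find_equivalent(current_dom, archived))
```

This addresses a common RPA failure mode: bots hard-coded to element identifiers break when a page redesign renames them, even though an equivalent control still exists.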
-
Patent number: 10949225
Abstract: The present disclosure involves systems, software, and computer implemented methods for automatically detecting user interface elements. One example method includes accessing master frame information for a user interface of an application. The master frame information includes first captured user interface information captured during a first execution of a scenario for the application. A subsequent execution of the scenario is performed, including capturing second captured user interface information. A determination is made that the subsequent execution of the scenario has not passed successfully, due to a non-functional error. A determination is made that the non-functional error is based on a failure to locate a user interface element specified in the master frame. A recovery strategy is performed, using the first captured user interface information and the second captured user interface information, to automatically locate the user interface element.
Type: Grant
Filed: February 6, 2019
Date of Patent: March 16, 2021
Assignee: SAP SE
Inventors: Warren Mark Fernandes, Mohammad Saad Rashid, Sai Phani Sharath Chandra Danthalapelli, Sonam Saxena, Mithilesh Kumar Singh
-
Publication number: 20200401431
Abstract: A computer-implemented method can receive a request from a robotic process automation engine to identify a target user interface control element in a webpage represented by a current master data frame. The current master data frame comprises a current document object model (DOM). The method can determine that a target user interface control element identifier associated with the target user interface control element is absent in the current DOM. The method can retrieve an archived version of the target user interface control element from an archived master data frame of the webpage. The method can find an equivalent user interface control element within the current master data frame based at least on the archived version of the target user interface control element, and output an equivalent user interface control element identifier associated with the equivalent user interface control element.
Type: Application
Filed: August 29, 2019
Publication date: December 24, 2020
Applicant: SAP SE
Inventors: Mohammad Saad Rashid, Warren Mark Fernandes, Mithilesh Kumar Singh, Sonam Saxena, Sai Phani Sharath Chandra Danthalapelli
-
Patent number: 10740221
Abstract: In one aspect, there is provided a method for software testing. The method may include executing a test script including at least one test instruction requiring an input at a user interface element displayed on a screen of a device under test; determining, based on a machine learning model, a candidate location on the screen of the device under test, the candidate location representing a candidate portion of the screen having the user interface element for the required input associated with the at least one test instruction; recognizing, based on optical character recognition, one or more characters in the determined candidate location; selecting, based on the recognized characters, the determined candidate location as the user interface element having the required input; and executing an inserted value at the determined candidate location to test a result of the test script execution.
Type: Grant
Filed: October 15, 2018
Date of Patent: August 11, 2020
Assignee: SAP SE
Inventors: Sonam Saxena, Samir Patil, Warren Mark Fernandes, Sai Phani Sharath Chandra Danthalapelli, Mithilesh Kumar Singh
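The two-stage selection the abstract describes can be sketched as follows: a model proposes candidate screen regions, then OCR-recognized text in each region confirms the right one. Both stages are stubbed with static data here; the region keys and labels are hypothetical:

```python
def pick_input_location(candidates, ocr_read, expected_label):
    """Choose the candidate region whose recognized text matches the label."""
    for bbox in candidates:
        if expected_label.lower() in ocr_read(bbox).lower():
            return bbox
    return None

# Hypothetical OCR results keyed by candidate region (x1, y1, x2, y2).
texts = {(0, 0, 100, 30): "Username", (0, 40, 100, 70): "Password"}

def ocr(bbox):
    return texts.get(bbox, "")

bbox = pick_input_location(list(texts), ocr, "password")
print(bbox)
```

Once a region is confirmed, the test value would be entered at that location and the script result checked, as in the final steps of the claimed method.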
-
Publication number: 20200249964
Abstract: The present disclosure involves systems, software, and computer implemented methods for automatically detecting user interface elements. One example method includes accessing master frame information for a user interface of an application. The master frame information includes first captured user interface information captured during a first execution of a scenario for the application. A subsequent execution of the scenario is performed, including capturing second captured user interface information. A determination is made that the subsequent execution of the scenario has not passed successfully, due to a non-functional error. A determination is made that the non-functional error is based on a failure to locate a user interface element specified in the master frame. A recovery strategy is performed, using the first captured user interface information and the second captured user interface information, to automatically locate the user interface element.
Type: Application
Filed: February 6, 2019
Publication date: August 6, 2020
Inventors: Warren Mark Fernandes, Mohammad Saad Rashid, Sai Phani Sharath Chandra Danthalapelli, Sonam Saxena, Mithilesh Kumar Singh
-
Publication number: 20200117577
Abstract: In one aspect, there is provided a method for software testing. The method may include executing a test script including at least one test instruction requiring an input at a user interface element displayed on a screen of a device under test; determining, based on a machine learning model, a candidate location on the screen of the device under test, the candidate location representing a candidate portion of the screen having the user interface element for the required input associated with the at least one test instruction; recognizing, based on optical character recognition, one or more characters in the determined candidate location; selecting, based on the recognized characters, the determined candidate location as the user interface element having the required input; and executing an inserted value at the determined candidate location to test a result of the test script execution.
Type: Application
Filed: October 15, 2018
Publication date: April 16, 2020
Inventors: Sonam Saxena, Samir Patil, Warren Mark Fernandes, Sai Phani Sharath Chandra Danthalapelli, Mithilesh Kumar Singh