Patents by Inventor Inbar Shani
Inbar Shani has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11436133
Abstract: Example implementations relate to comparable UI object identifications. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also automatically identify, based on the correlated data points, a set of comparable UI objects.
Type: Grant
Filed: March 23, 2016
Date of Patent: September 6, 2022
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Ilan Shufer, Amichai Nitsan
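The capture-and-correlate pattern described in this abstract (and repeated in several entries below) can be sketched roughly as follows. This is an illustrative assumption of how such engines might fit together, not the patented implementation; all names and data shapes are hypothetical.

```python
from collections import defaultdict

def correlate(data_points):
    """Group captured data points by test execution, ordered by event sequence."""
    by_execution = defaultdict(list)
    for point in data_points:
        by_execution[point["execution"]].append(point)
    for points in by_execution.values():
        points.sort(key=lambda p: p["seq"])  # order by the sequence of events
    return dict(by_execution)

def comparable_ui_objects(correlated):
    """UI objects acted on at the same step in more than one execution."""
    seen = defaultdict(set)
    for execution, points in correlated.items():
        for step, point in enumerate(points):
            if point["kind"] == "test_action":
                seen[(step, point["ui_object"])].add(execution)
    return {obj for (step, obj), execs in seen.items() if len(execs) > 1}

points = [
    {"execution": "run1", "seq": 1, "kind": "test_action", "ui_object": "login_btn"},
    {"execution": "run1", "seq": 2, "kind": "app_action", "ui_object": None},
    {"execution": "run2", "seq": 1, "kind": "test_action", "ui_object": "login_btn"},
]
print(comparable_ui_objects(correlate(points)))  # {'login_btn'}
```

The same correlated structure could plausibly feed the comparison, regression, and test-composition engines in the related patents listed below.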
-
Patent number: 11119899
Abstract: Example implementations relate to determining potential test actions. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a test verification engine to determine, based on the correlation of the data points, a potential test action to perform during a future test execution of the application under test.
Type: Grant
Filed: May 28, 2015
Date of Patent: September 14, 2021
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
-
Patent number: 11016867
Abstract: Example implementations relate to test execution comparisons. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also automatically compare the test executions, based on the correlated data points, to identify commonalities.
Type: Grant
Filed: December 18, 2015
Date of Patent: May 25, 2021
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
-
Patent number: 10860458
Abstract: In one example of the disclosure, a user-defined success criterion for an application change is received. The criterion is provided to a computing system associated with a developer-user of the application. Evaluation code, for evaluating implementation of the change according to the criterion, is received from the computing system. The evaluation code is caused to execute responsive to receipt of a notice of production deployment of the change. A success rating for the change is determined based upon application performance data attained via execution of the evaluation code.
Type: Grant
Filed: July 31, 2014
Date of Patent: December 8, 2020
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Gil Pearl, Amichai Nitsan
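The flow in this abstract could be sketched along the following lines; the latency criterion, the 0.95 success threshold, and all function names are assumptions for illustration only.

```python
def make_evaluator(criterion):
    """Developer-supplied evaluation code for a user-defined success criterion."""
    def evaluate(performance_data):
        met = [sample <= criterion["max_latency_ms"] for sample in performance_data]
        return sum(met) / len(met)  # fraction of samples meeting the criterion
    return evaluate

def on_deployment_notice(evaluator, performance_data, threshold=0.95):
    """Run the evaluation code when a production deployment notice arrives."""
    rating = evaluator(performance_data)
    return {"rating": rating, "success": rating >= threshold}

evaluate = make_evaluator({"max_latency_ms": 200})
result = on_deployment_notice(evaluate, [120, 150, 180, 250])
print(result)  # {'rating': 0.75, 'success': False}
```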
-
Patent number: 10860448
Abstract: Example implementations relate to determining a functional state of a system under test. For example, a system to determine a functional state of a system under test may include a system controller to execute a functional test of the system under test by invoking a subset of a plurality of functional agents to interact with the system under test. Further, the system may include an agent repository to interact with the system controller and store the plurality of functional agents. Also, the system may include a state module to determine a functional state for the system under test by querying each of the subset of functional agents and comparing aggregated results from the subset of functional agents against states defined for the system under test.
Type: Grant
Filed: January 13, 2016
Date of Patent: December 8, 2020
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Eli Mordechai, Jonathan Taragin, Iris Sasson
-
Publication number: 20200310952
Abstract: Example implementations relate to comparable UI object identifications. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also automatically identify, based on the correlated data points, a set of comparable UI objects.
Type: Application
Filed: March 23, 2016
Publication date: October 1, 2020
Inventors: Inbar Shani, Ilan Shufer, Amichai Nitsan
-
Patent number: 10545775
Abstract: An application process can be executed based on an initialization instruction, where the application process includes instructions associated with a hook framework. A virtual machine configured to load the hook framework on the virtual machine based on instructions included in the application process can be initiated, and the instructions associated with the hook framework can be executed upon initiation of the virtual machine to insert a hook on the virtual machine. A nascent process configured to initiate an additional virtual machine can be initiated based on a request to load an application, where the additional virtual machine is hooked via the hook inserted on the virtual machine.
Type: Grant
Filed: June 28, 2013
Date of Patent: January 28, 2020
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Sigal Maon, Amichai Nitsan
-
Patent number: 10534700
Abstract: Example implementations relate to separating verifications from test executions. Some implementations may include a data capture engine that captures data points during test executions of the application under test. The data points may include, for example, application data, test data, and environment data. Additionally, some implementations may include a data correlation engine that correlates each of the data points with a particular test execution state of the application under test based on a sequence of events that occurred during the particular test execution state. Furthermore, some implementations may also include a test verification engine that, based on the correlation of the data points, verifies an actual behavior of the application under test separately from the particular test execution state.
Type: Grant
Filed: December 9, 2014
Date of Patent: January 14, 2020
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Ilan Shufer, Amichai Nitsan
-
Patent number: 10528456
Abstract: Example implementations relate to determining idle testing periods. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include an idle testing period determination engine to determine, based on the correlation of the data points, idle testing periods of the test executions. The idle testing periods may be periods of time where both the test executions and the application under test are idle.
Type: Grant
Filed: May 4, 2015
Date of Patent: January 7, 2020
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
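A period where both the test and the application under test are idle is, in effect, the intersection of their idle intervals. A minimal sketch of that idea, with interval lists assumed to be sorted and non-overlapping (an illustrative simplification, not the patented method):

```python
def intersect(test_idle, app_idle):
    """Intersect two sorted lists of (start, end) idle intervals."""
    out, i, j = [], 0, 0
    while i < len(test_idle) and j < len(app_idle):
        start = max(test_idle[i][0], app_idle[j][0])
        end = min(test_idle[i][1], app_idle[j][1])
        if start < end:
            out.append((start, end))  # both were idle in this span
        # advance whichever interval ends first
        if test_idle[i][1] < app_idle[j][1]:
            i += 1
        else:
            j += 1
    return out

test_idle = [(0, 5), (10, 20)]
app_idle = [(3, 12), (18, 25)]
print(intersect(test_idle, app_idle))  # [(3, 5), (10, 12), (18, 20)]
```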
-
Patent number: 10509719
Abstract: Example implementations relate to automatically identifying regressions. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a regression identification engine to automatically identify, based on the correlated data points, a regression between a first version of the application under test and a second version of the application under test.
Type: Grant
Filed: September 8, 2015
Date of Patent: December 17, 2019
Assignee: Micro Focus LLC
Inventors: Inbar Shani, Ayal Cohen, Yaron Burg
-
Patent number: 10496512
Abstract: Error data may be collected. The error data may represent a first plurality of errors of a first type and a second plurality of errors of a second type to occur in a plurality of instances of an application transaction. Visualization data may be generated. The visualization data may represent an error flow diagram to display on an output device. The error flow diagram may comprise a first block having a first visual property based on a first number of the first plurality of errors, a second block having a second visual property based on a second number of the second plurality of errors, and a first linkage between the first block and the second block.
Type: Grant
Filed: May 11, 2016
Date of Patent: December 3, 2019
Assignee: Micro Focus LLC
Inventors: Amichai Nitsan, Inbar Shani
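Generating visualization data of the kind this abstract describes might look like the following sketch, where block size stands in for the "visual property" derived from each error count; the scaling rule and data layout are assumptions, not the patented design.

```python
def build_error_flow(error_counts, links):
    """Turn per-type error counts into blocks plus linkages for a flow diagram."""
    blocks = {
        err_type: {"label": err_type, "size": 10 + count}  # size scales with count
        for err_type, count in error_counts.items()
    }
    return {"blocks": blocks, "links": links}

diagram = build_error_flow(
    {"timeout": 42, "http_500": 7},
    [("timeout", "http_500")],  # the linkage between the two blocks
)
print(diagram["blocks"]["timeout"]["size"])  # 52
```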
-
Patent number: 10365995
Abstract: Example implementations relate to composing future tests. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a test composition engine to compose, based on an interaction with a visualization of results of a verification query of the correlated data points, a future test of the application under test.
Type: Grant
Filed: August 4, 2015
Date of Patent: July 30, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
-
Patent number: 10360140
Abstract: Example embodiments relate to determining code coverage based on production sampling. In example embodiments, a production execution data set that includes metrics for code units of a software application is obtained, where the metrics include input and output values for each of the code units and an average execution count for each of the code units. Further, application code execution is tracked during a testing procedure of the software application to determine executed lines of code. At this stage, production code coverage of the software application is determined based on the production execution data set and the executed lines of code.
Type: Grant
Filed: November 27, 2013
Date of Patent: July 23, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Boaz Shor, Gil Pearl, Ohad Assulin, Inbar Shani
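One plausible reading of combining the production execution data set with the tested lines is: of the code units actually exercised in production, how many had all of their lines executed during testing? The metric below is an illustrative assumption, not the formula claimed in the patent.

```python
def production_coverage(production_units, tested_lines):
    """Share of production-exercised code units fully covered by testing."""
    exercised = [u for u in production_units if u["avg_exec_count"] > 0]
    covered = [u for u in exercised if set(u["lines"]) <= tested_lines]
    return len(covered) / len(exercised)

units = [
    {"name": "parse", "avg_exec_count": 120, "lines": {10, 11, 12}},
    {"name": "render", "avg_exec_count": 80, "lines": {30, 31}},
    {"name": "legacy", "avg_exec_count": 0, "lines": {90}},  # dead in production
]
print(production_coverage(units, tested_lines={10, 11, 12, 31}))  # 0.5
```

Weighting each unit by its average execution count would be a natural refinement, so that hot code paths dominate the score.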
-
Patent number: 10324710
Abstract: Examples disclosed herein relate to indicating a trait of a continuous delivery pipeline. Examples include accessing, for each of a plurality of continuous delivery (CD) pipelines, respective pipeline characteristics previously collected by a collection engine of a CD server for at least one of the CD pipelines, and indicating a trait of the pipeline characteristics of at least one of the CD pipelines.
Type: Grant
Filed: November 15, 2013
Date of Patent: June 18, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Inbar Shani, Lior Reuven, Amichai Nitsan
-
Patent number: 10289406
Abstract: An example method for handling dependencies between feature flags can include defining, by a processing resource executing instructions, dependencies between a plurality of feature flags in a process executable by the processing resource. The method can include enforcing, by the processing resource executing instructions, the dependencies during activation of a first feature by a determination of validity of utilization of a feature flag as a switch for a second feature.
Type: Grant
Filed: April 30, 2013
Date of Patent: May 14, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Inbar Shani, Sharon Lin, Yael Oshri
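Defining and then enforcing feature-flag dependencies at activation time might be sketched as follows; the class, the flag names, and the activation check are hypothetical illustrations of the idea, not the claimed method.

```python
class FeatureFlags:
    """Feature flags with declared dependencies enforced on activation."""

    def __init__(self, dependencies):
        self.dependencies = dependencies  # flag -> set of flags it requires
        self.active = set()

    def activate(self, flag):
        # Validity check: a flag may only switch its feature on when every
        # flag it depends on is already active.
        missing = self.dependencies.get(flag, set()) - self.active
        if missing:
            raise ValueError(f"{flag} requires active flags: {sorted(missing)}")
        self.active.add(flag)

flags = FeatureFlags({"new_checkout": {"new_cart"}})
flags.activate("new_cart")
flags.activate("new_checkout")  # valid: its dependency is already active
print(sorted(flags.active))  # ['new_cart', 'new_checkout']
```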
-
Publication number: 20190079786
Abstract: Example implementations relate to creating a simulation environment. For example, a system for simulation environment creation may include a system controller. The system may also include a functional agent coupled to and executed by the system controller to create a simulation environment based on input comprising real user data and sensor data continuously received from a plurality of sources, perform tests within the simulation environment, continuously react to events occurring during the tests within the simulation environment, and continuously verify results of the tests.
Type: Application
Filed: March 7, 2016
Publication date: March 14, 2019
Inventors: Inbar Shani, Jonathan Taragin, Eli Mordechai
-
Publication number: 20190018748
Abstract: Example implementations relate to determining a functional state of a system under test. For example, a system to determine a functional state of a system under test may include a system controller to execute a functional test of the system under test by invoking a subset of a plurality of functional agents to interact with the system under test. Further, the system may include an agent repository to interact with the system controller and store the plurality of functional agents. Also, the system may include a state module to determine a functional state for the system under test by querying each of the subset of functional agents and comparing aggregated results from the subset of functional agents against states defined for the system under test.
Type: Application
Filed: January 13, 2016
Publication date: January 17, 2019
Inventors: Inbar Shani, Eli Mordechai, Jonathan Taragin, Iris Sasson
-
Patent number: 10175958
Abstract: Examples disclosed herein relate to acquiring identification of an application lifecycle management (ALM) entity associated with similar code. Examples include identifying a target code segment, and acquiring, from an ALM system, identification of an ALM entity associated with other code similar to the target code segment and identified by a code similarity system.
Type: Grant
Filed: January 30, 2013
Date of Patent: January 8, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Inbar Shani, Yaron Burg, Amichai Nitsan
-
Patent number: 10169216
Abstract: Simulating sensors can include hooking an application associated with sensory data and associating the sensory data with an automation instruction. Simulating sensors can include providing the sensory data to a support device having an ability to modify the application and automatically causing the support device to simulate a sensory input using the sensory data by executing the automation instruction.
Type: Grant
Filed: June 28, 2013
Date of Patent: January 1, 2019
Assignee: ENTIT SOFTWARE LLC
Inventors: Inbar Shani, Amichai Nitsan, Sigal Maon
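The hook-then-replay idea in this abstract can be sketched with a toy in-process hook; a real support device would modify the application externally, so the class, the accelerometer readings, and the hooking mechanism here are purely illustrative assumptions.

```python
class App:
    def read_accelerometer(self):
        raise RuntimeError("no sensor hardware in the test environment")

def hook_sensor(app, recorded_samples):
    """Hook the app's sensor read to replay recorded sensory data."""
    samples = iter(recorded_samples)
    # Shadow the real read on this instance with a replay of the recording,
    # standing in for the automation instruction that simulates the input.
    app.read_accelerometer = lambda: next(samples)
    return app

app = hook_sensor(App(), recorded_samples=[(0.0, 9.8, 0.1), (0.2, 9.7, 0.0)])
print(app.read_accelerometer())  # (0.0, 9.8, 0.1)
print(app.read_accelerometer())  # (0.2, 9.7, 0.0)
```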
-
Publication number: 20180365123
Abstract: Example implementations relate to test execution comparisons. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also automatically compare the test executions, based on the correlated data points, to identify commonalities.
Type: Application
Filed: December 18, 2015
Publication date: December 20, 2018
Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg