Patents by Inventor Inbar Shani

Inbar Shani has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10102105
    Abstract: In one example of the disclosure, code lines for a software program are received, the code lines including a unit of code lines. Code entities within the unit are identified. Each code entity includes a line or consecutive lines of code implementing a distinct program requirement or defect fix for the program. Context changes are identified within the unit, each context change including an occurrence of a first code line set implementing an entity, adjacent to a second code line set implementing another entity, within a same code scope. A code complexity score is determined based upon counts of entities identified and context changes identified within the unit, and upon counts of code lines and entities within the program.
    Type: Grant
    Filed: June 24, 2014
    Date of Patent: October 16, 2018
    Assignee: ENTIT SOFTWARE LLC
    Inventors: Inbar Shani, Ohad Assulin, Yaron Burg
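The abstract above describes scoring a unit of code by counting distinct code entities and "context changes" (adjacent line sets in the same scope that implement different entities). A minimal sketch of that idea, under the assumption that lines arrive pre-tagged with an entity id and a scope name (the tagging mechanism and the exact weighting are not specified in the abstract):

```python
from typing import List, Tuple


def complexity_score(tagged_lines: List[Tuple[int, str]],
                     total_program_lines: int,
                     total_program_entities: int) -> float:
    """Score a unit of code lines.

    Each tagged line is (entity_id, scope). A context change is counted
    whenever two adjacent lines in the same scope implement different
    entities. The score combines unit-level counts with program-wide
    counts, as the abstract describes.
    """
    entities = {entity for entity, _ in tagged_lines}
    context_changes = sum(
        1
        for (e1, s1), (e2, s2) in zip(tagged_lines, tagged_lines[1:])
        if s1 == s2 and e1 != e2
    )
    # One possible normalization against program-wide counts.
    return (len(entities) + context_changes) / (
        total_program_lines + total_program_entities
    )
```

A unit that interleaves lines from several requirements in the same scope scores higher than one where each entity's lines are contiguous.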
  • Publication number: 20180267888
    Abstract: Example implementations relate to automatically identifying regressions. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a regression identification engine to automatically identify, based on the correlated data points, a regression between a first version of the application under test and a second version of the application under test.
    Type: Application
    Filed: September 8, 2015
    Publication date: September 20, 2018
    Inventors: Inbar Shani, Ayal Cohen, Yaron Burg
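The data-correlation and regression-identification engines above can be illustrated with a small sketch: group captured data points by the test execution they belong to (ordered by the event sequence), then flag tests whose outcome degrades between two application versions. The field names and the pass/fail comparison are illustrative assumptions, not the claimed implementation:

```python
from collections import defaultdict


def correlate(data_points):
    """Group captured data points by the test execution they belong to.

    Each data point is a dict with 'execution_id', 'seq' (event order),
    and a payload such as test action or application action data.
    """
    by_exec = defaultdict(list)
    for point in data_points:
        by_exec[point["execution_id"]].append(point)
    for points in by_exec.values():
        points.sort(key=lambda p: p["seq"])  # restore the event sequence
    return dict(by_exec)


def find_regressions(v1_results, v2_results):
    """Flag tests that passed on version 1 but fail (or vanish) on version 2."""
    return [test for test, ok in v1_results.items()
            if ok and not v2_results.get(test, False)]
```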
  • Patent number: 10073755
    Abstract: Example embodiments relate to tracing source code for end user monitoring. In example embodiments, an application is monitored to obtain an interaction log, where the interaction log tracks application interactions by each of a plurality of synthetic monitors. Further, an execution of application code that is associated with the application is monitored to obtain an instrumentation log. At this stage, the interaction log and the instrumentation log are used to determine relationships between portions of the application code and the plurality of synthetic monitors. A notification of a modification to the application is received, and an affected subset of the synthetic monitors that are affected by the modification are identified based on the relationships.
    Type: Grant
    Filed: September 30, 2013
    Date of Patent: September 11, 2018
    Assignee: ENTIT SOFTWARE LLC
    Inventors: Inbar Shani, Gil Perel, Guy Offer
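The tracing idea above can be sketched as a join between two logs: the interaction log ties each synthetic monitor to the requests it issued, and the instrumentation log ties each request to the code units it executed. A monitor is affected by a modification when any of its requests touched a modified unit. The log record shapes here are assumptions for illustration:

```python
from collections import defaultdict


def affected_monitors(interaction_log, instrumentation_log, modified_units):
    """Identify synthetic monitors affected by a code modification.

    interaction_log: iterable of (monitor_id, request_id) pairs.
    instrumentation_log: iterable of (request_id, code_unit) pairs.
    modified_units: code units touched by the modification.
    """
    units_by_request = defaultdict(set)
    for request_id, unit in instrumentation_log:
        units_by_request[request_id].add(unit)
    modified = set(modified_units)
    affected = set()
    for monitor_id, request_id in interaction_log:
        # A monitor is affected if any of its requests ran modified code.
        if units_by_request[request_id] & modified:
            affected.add(monitor_id)
    return affected
```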
  • Publication number: 20180232299
    Abstract: Example implementations relate to composing future tests. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a test composition engine to compose, based on an interaction with a visualization of results of a verification query of the correlated data points, a future test of the application under test.
    Type: Application
    Filed: August 4, 2015
    Publication date: August 16, 2018
    Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
  • Publication number: 20180143897
    Abstract: Example implementations relate to determining idle testing periods. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include an idle testing period determination engine to determine, based on the correlation of the data points, idle testing periods of the test executions. The idle testing periods may be periods of time where both the test executions and the application under test are idle.
    Type: Application
    Filed: May 4, 2015
    Publication date: May 24, 2018
    Applicant: ENTIT Software LLC
    Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
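An idle testing period, as defined above, is a span where both the test execution and the application under test are inactive. One simple way to compute such spans (assuming the correlated data points have already been reduced to busy intervals for each side) is to merge both busy-interval lists and return the gaps:

```python
def idle_periods(test_busy, app_busy, start, end):
    """Return intervals within [start, end) where neither the tests nor
    the application under test are active.

    test_busy / app_busy: lists of (begin, end) busy intervals.
    """
    busy = sorted(test_busy + app_busy)
    idle, cursor = [], start
    for begin, finish in busy:
        if begin > cursor:
            idle.append((cursor, begin))  # gap before this busy interval
        cursor = max(cursor, finish)
    if cursor < end:
        idle.append((cursor, end))  # trailing gap
    return idle
```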
  • Publication number: 20180137036
    Abstract: Example implementations relate to determining potential test actions. Some implementations may include a data capture engine to capture data points during test executions of the application under test. The data points may include, for example, test action data and application action data. Additionally, some implementations may include a data correlation engine to correlate each of the data points with a particular test execution of the test executions, and each of the data points may be correlated based on a sequence of events that occurred during the particular test execution. Furthermore, some implementations may also include a test verification engine to determine, based on the correlation of the data points, a potential test action to perform during a future test execution of the application under test.
    Type: Application
    Filed: May 28, 2015
    Publication date: May 17, 2018
    Inventors: Inbar Shani, Amichai Nitsan, Yaron Burg
  • Publication number: 20170371727
    Abstract: Examples relate to execution of interaction flows. The examples disclosed herein enable obtaining, via a user interface of a local client computing device, an interaction flow that defines an order of execution of a plurality of interaction points and values exchanged among the plurality of interaction points, the plurality of interaction points comprising a first interaction point that indicates an event executed by an application; triggering the execution of the interaction flow; determining whether any of remote client computing devices that are in communication with the local client computing device includes the application; and causing the first interaction point to be executed by the application in at least one of the remote client computing devices that are determined to include the application.
    Type: Application
    Filed: December 22, 2014
    Publication date: December 28, 2017
    Inventors: Inbar Shani, Olga Kogan, Amichai Nitsan
  • Publication number: 20170329664
    Abstract: Error data may be collected. The error data may represent a first plurality of errors of a first type and a second plurality of errors of a second type to occur in a plurality of instances of an application transaction. Visualization data may be generated. The visualization data may represent an error flow diagram to display on an output device. The error flow diagram may comprise a first block having a first visual property based on a first number of the first plurality of errors, a second block having a second visual property based on a second number of the second plurality of errors, and a first linkage between the first block and the second block.
    Type: Application
    Filed: May 11, 2016
    Publication date: November 16, 2017
    Inventors: Amichai Nitsan, Inbar Shani
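The visualization data described above can be sketched as follows: count errors per type to size each block's visual property, and link types that co-occur within the same transaction instance. The use of relative count as the visual "weight" and co-occurrence as the linkage rule are illustrative assumptions:

```python
from collections import Counter, defaultdict


def error_flow_data(errors):
    """Build error-flow-diagram data from (transaction_id, error_type) pairs.

    Each block gets a 'weight' visual property proportional to its error
    count; linkages connect error types seen in the same transaction.
    """
    counts = Counter(err_type for _, err_type in errors)
    by_txn = defaultdict(set)
    for txn, err_type in errors:
        by_txn[txn].add(err_type)
    links = set()
    for types in by_txn.values():
        ordered = sorted(types)
        links.update(zip(ordered, ordered[1:]))
    peak = max(counts.values())
    blocks = {t: {"count": n, "weight": n / peak} for t, n in counts.items()}
    return blocks, sorted(links)
```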
  • Patent number: 9792202
    Abstract: Examples disclosed herein relate to identifying a configuration element value as a potential cause of a testing operation failure. Examples include causing a testing operation to be performed approximately in parallel on each of a plurality of instances of an application executed in respective testing environments, acquiring configuration element values from each of the testing environments, and identifying at least one of the configuration element values as a potential cause of a testing operation failure.
    Type: Grant
    Filed: November 15, 2013
    Date of Patent: October 17, 2017
    Assignee: ENTIT SOFTWARE LLC
    Inventors: Inbar Shani, Roy Nuriel, Amichai Nitsan
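The identification step above amounts to a configuration diff across testing environments: a configuration element is a plausible failure cause when its values in the failing environments never overlap with its values in the passing ones. A minimal sketch under that assumption:

```python
def suspect_config_elements(env_configs, failed_envs):
    """Flag configuration elements whose values separate failing
    environments from passing ones.

    env_configs: {env_name: {config_key: value}}.
    failed_envs: environments where the testing operation failed.
    """
    failed = set(failed_envs)
    passing = [e for e in env_configs if e not in failed]
    all_keys = set().union(*(cfg.keys() for cfg in env_configs.values()))
    suspects = {}
    for key in all_keys:
        fail_vals = {env_configs[e].get(key) for e in failed}
        pass_vals = {env_configs[e].get(key) for e in passing}
        # Disjoint value sets make this element a potential cause.
        if fail_vals and pass_vals and not (fail_vals & pass_vals):
            suspects[key] = fail_vals
    return suspects
```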
  • Publication number: 20170293551
    Abstract: Example implementations relate to separating verifications from test executions. Some implementations may include a data capture engine that captures data points during test executions of the application under test. The data points may include, for example, application data, test data, and environment data. Additionally, some implementations may include a data correlation engine that correlates each of the data points with a particular test execution state of the application under test based on a sequence of events that occurred during the particular test execution state. Furthermore, some implementations may also include a test verification engine that, based on the correlation of the data points, verifies an actual behavior of the application under test separately from the particular test execution state.
    Type: Application
    Filed: December 9, 2014
    Publication date: October 12, 2017
    Applicant: Hewlett Packard Enterprise Development LP
    Inventors: Inbar Shani, Ilan Shufer, Amichai Nitsan
  • Patent number: 9753750
    Abstract: A method to manage a global feature library is provided herein. The method provides a global feature library with a plurality of features defined therein. The global feature library includes a feature switch for each of the plurality of features. The feature switch is usable with, and linked to, an application code. The feature switch includes a feature value that turns the feature associated with the feature switch on and off based on a global value rule. The global value rule is transmitted to a client device capable of storing the feature value in-memory.
    Type: Grant
    Filed: August 30, 2012
    Date of Patent: September 5, 2017
    Assignee: ENTIT SOFTWARE LLC
    Inventors: Inbar Shani, Amichai Nitsan, Lior Manor
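The feature-switch mechanism above can be sketched as a library of rules evaluated per client, with the resulting switch values cached in memory on the client side, as the abstract describes. The class and method names are illustrative assumptions:

```python
class FeatureLibrary:
    """Global feature library: each feature has a switch whose value is
    derived from a global value rule (a predicate over client context)."""

    def __init__(self):
        self._rules = {}

    def register(self, feature, rule):
        self._rules[feature] = rule  # rule: context -> bool

    def features(self):
        return list(self._rules)

    def evaluate(self, feature, context):
        return self._rules[feature](context)


class Client:
    """Client device that stores evaluated feature values in memory."""

    def __init__(self, library, context):
        self._values = {f: library.evaluate(f, context)
                        for f in library.features()}

    def is_on(self, feature):
        return self._values[feature]
```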
  • Publication number: 20170228312
    Abstract: In some examples, a system provides a plurality of code changes of a program in a continuous deployment pipeline that performs tests on the plurality of code changes and deploys a code assembly including the plurality of code changes. The system ranks a plurality of tests based on a test attribute and a test context to provide a test rank for each of the plurality of tests, and selects, from among the plurality of tests using the test ranks, a subset of tests to include in a test set for the continuous deployment pipeline. The system executes the test set in the continuous deployment pipeline to test the plurality of code changes.
    Type: Application
    Filed: April 24, 2017
    Publication date: August 10, 2017
    Inventors: Inbar Shani, Amichai Nitsan, Sigal Maon
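The ranking-and-selection step above combines a test attribute with a test context to pick a subset for the pipeline. A minimal sketch, assuming historical failure rate as the attribute and overlap with the changed files as the context (the patent does not fix these choices):

```python
def rank_tests(tests, changed_files, budget):
    """Rank tests and keep the top `budget` for the pipeline's test set.

    tests: list of dicts with 'name', 'fail_rate' (test attribute) and
    'covers' (files the test exercises, the test context).
    """
    changed = set(changed_files)

    def score(test):
        relevance = len(set(test["covers"]) & changed)
        return relevance + test["fail_rate"]

    ranked = sorted(tests, key=score, reverse=True)
    return [t["name"] for t in ranked[:budget]]
```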
  • Patent number: 9703687
    Abstract: A monitor that monitors an application is provided herein. The monitor provides a monitor function that monitors an application. The monitor embeds the monitor function in meta data of a code base of the application. The monitor function embedded in the code base is activated. The monitor function is used to monitor data associated therewith during deployment of the application.
    Type: Grant
    Filed: September 21, 2012
    Date of Patent: July 11, 2017
    Assignee: HEWLETT PACKARD ENTERPRISE DEVELOPMENT LP
    Inventors: Inbar Shani, Amichai Nitsan, Eli Mordechai
  • Publication number: 20170192875
    Abstract: In one example of the disclosure, a user-defined success criterion for an application change is received. The criterion is provided to a computing system associated with a developer-user of the application. Evaluation code, for evaluating implementation of the change according to the criterion, is received from the computing system. The evaluation code is caused to execute responsive to receipt of a notice of production deployment of the change. A success rating for the change is determined based upon application performance data attained via execution of the evaluation code.
    Type: Application
    Filed: July 31, 2014
    Publication date: July 6, 2017
    Applicant: HEWLETT PACKARD ENTERPRISE DEVELOPMENT LP
    Inventors: Inbar Shani, Gil Pearl, Amichai Nitsan
  • Patent number: 9652509
    Abstract: A method to prioritize a plurality of tests in a continuous deployment pipeline. The method ranks the plurality of tests based on a test attribute and a test context to provide a test rank for each of the plurality of tests. The method sets a test set for the continuous deployment pipeline using the test ranks. The method executes the test set in the continuous deployment pipeline.
    Type: Grant
    Filed: April 30, 2012
    Date of Patent: May 16, 2017
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Inbar Shani, Amichai Nitsan, Sigal Maon
  • Publication number: 20170083217
    Abstract: In one implementation, a system for user interface components load time visualization includes a load engine to monitor a load time of a number of elements of a user interface, a color engine to assign a color to each of the number of elements of the user interface based on the load time, and a compile engine to display a component color map of the user interface utilizing the color assigned to each of the number of elements.
    Type: Application
    Filed: September 8, 2014
    Publication date: March 23, 2017
    Inventors: Amichai Nitsan, Haim Shuvali, Inbar Shani
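The color-assignment step above can be sketched with a simple threshold scheme: fast-loading elements are green, moderate ones yellow, slow ones red. The specific thresholds and three-color palette are assumptions for illustration:

```python
def color_map(load_times_ms, thresholds=(100, 500)):
    """Assign a color to each UI element based on its load time.

    load_times_ms: {element_name: load_time_in_ms}.
    thresholds: (fast, slow) cutoffs in milliseconds.
    """
    fast, slow = thresholds

    def color(ms):
        if ms < fast:
            return "green"
        if ms < slow:
            return "yellow"
        return "red"

    return {elem: color(ms) for elem, ms in load_times_ms.items()}
```

A compile step would then render this mapping as a component color map overlaid on the user interface.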
  • Publication number: 20170053222
    Abstract: Role based assessment for an IT management system, includes maintaining a plurality of roles, each role attributable to a user type within an IT management system. Mappings are defined between the plurality of user roles and assets of the IT management system. An assessment for the IT management system is then assembled from the perspective of a selected one of the plurality of roles based on mappings between the selected user role and the assets.
    Type: Application
    Filed: February 19, 2014
    Publication date: February 23, 2017
    Inventors: Inbar Shani, Amichai Nitsan, Dror Saaroni
  • Publication number: 20170031800
    Abstract: In one example of the disclosure, code lines for a software program are received, the code lines including a unit of code lines. Code entities within the unit are identified. Each code entity includes a line or consecutive lines of code implementing a distinct program requirement or defect fix for the program. Context changes are identified within the unit, each context change including an occurrence of a first code line set implementing an entity, adjacent to a second code line set implementing another entity, within a same code scope. A code complexity score is determined based upon counts of entities identified and context changes identified within the unit, and upon counts of code lines and entities within the program.
    Type: Application
    Filed: June 24, 2014
    Publication date: February 2, 2017
    Applicant: HEWLETT PACKARD ENTERPRISE DEVELOPMENT LP
    Inventors: Inbar Shani, Ohad Assulin, Yaron Burg
  • Publication number: 20160283355
    Abstract: Examples disclosed herein relate to identifying a configuration element value as a potential cause of a testing operation failure. Examples include causing a testing operation to be performed approximately in parallel on each of a plurality of instances of an application executed in respective testing environments, acquiring configuration element values from each of the testing environments, and identifying at least one of the configuration element values as a potential cause of a testing operation failure.
    Type: Application
    Filed: November 15, 2013
    Publication date: September 29, 2016
    Inventors: Inbar Shani, Roy Nuriel, Amichai Nitsan
  • Publication number: 20160259714
    Abstract: Example embodiments relate to determining code coverage based on production sampling. In example embodiments, a production execution data set that includes metrics for code units of a software application is obtained, where the metrics include input and output values for each of the code units and an average execution count for each of the code units. Further, application code execution is tracked during a testing procedure of the software application to determine executed lines of code. At this stage, production code coverage of the software application is determined based on the production execution data set and the executed lines of code.
    Type: Application
    Filed: November 27, 2013
    Publication date: September 8, 2016
    Inventors: Boaz Shor, Gil Perel, Ohad Assulin, Inbar Shani
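The coverage determination above weights tested code by how often it actually runs in production. One simple reading, assuming coverage is the production-execution-weighted fraction of code units exercised during testing:

```python
def production_coverage(production_counts, tested_units):
    """Compute production-weighted code coverage.

    production_counts: {code_unit: average production execution count}
    from the production execution data set.
    tested_units: code units executed during the testing procedure.
    """
    total = sum(production_counts.values())
    covered = sum(count for unit, count in production_counts.items()
                  if unit in tested_units)
    return covered / total if total else 0.0
```

Under this weighting, covering one hot code path can matter more than covering many rarely executed ones.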