Patents by Inventor Ofir Ezrielev

Ofir Ezrielev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230419125
    Abstract: Methods and systems for managing data aggregation in a distributed environment are disclosed. The data may be aggregated using twin inference models, which may be used to reduce the quantity of data transmitted to aggregate the data. To obtain twin inference models, models may be trained, which may consume computing resources. A computing resource cost for training the twin inference models may be estimated based on an estimated number of twin inference models necessary to meet inference accuracy goals. A model training device that has an available quantity of computing resources sufficient to meet the computing resource cost may be obtained. The model training device may be used to train and distribute inference models for data aggregation purposes.
    Type: Application
    Filed: June 27, 2022
    Publication date: December 28, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
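As a rough illustration of the resource-estimation step described in publication 20230419125 above, the sketch below estimates a training cost from an assumed number of twin models and picks a training device with enough spare capacity. The class name, the core-hour figures, and the selection rule are hypothetical, not taken from the application.

```python
# Hypothetical sketch; names and cost units are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingDevice:
    name: str
    available_core_hours: float

def estimate_training_cost(num_models: int, core_hours_per_model: float) -> float:
    """Estimated compute cost grows with the number of twin models needed for accuracy goals."""
    return num_models * core_hours_per_model

def select_training_device(devices: list[TrainingDevice], required_core_hours: float) -> Optional[TrainingDevice]:
    """Return the first device with enough spare capacity to cover the estimated cost."""
    for device in devices:
        if device.available_core_hours >= required_core_hours:
            return device
    return None

# Example: 12 collector/aggregator twin pairs, roughly 3 core-hours each.
cost = estimate_training_cost(num_models=12, core_hours_per_model=3.0)
device = select_training_device(
    [TrainingDevice("edge-trainer", 20.0), TrainingDevice("dc-trainer", 64.0)], cost
)
print(cost, device.name if device else "no capacity")
```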
  • Publication number: 20230418467
    Abstract: Methods and systems for managing data collection are disclosed. A data aggregator may aggregate data collected by a data collector. To reduce the computing resources used for aggregation, the data aggregator and data collector may implement a multi-stage data reduction process to reduce the quantity of data transmitted for data aggregation purposes. The multi-stage data reduction process may include implementing twin inference models at the aggregator and collector, identifying relationships in the data collected by the data collector using feature relationship inference models, transmitting a portion of the collected data to the data aggregator and withholding a second portion based on an acceptable level of error for use of the collected data, and reconstructing the withheld portion of the collected data at the aggregator. The reconstructed portion of the collected data may include the acceptable level of error when compared to a corresponding portion of the collected data.
    Type: Application
    Filed: June 27, 2022
    Publication date: December 28, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
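A minimal sketch of the withhold-and-reconstruct idea in publication 20230418467, assuming identical twin predictors on both sides and a fixed error budget; the feature-relationship models and the other stages described in the abstract are not modeled here.

```python
# Illustrative sketch, not the patented implementation.
ACCEPTABLE_ERROR = 0.5   # assumed error budget tolerated by the downstream consumer

def twin_predict(t: int) -> float:
    # Stand-in for a trained twin inference model; identical copies run on both sides.
    return 20.0 + 0.1 * t

def collector_reduce(measurements: dict[int, float]) -> dict[int, float]:
    """Transmit only the samples the twin model cannot predict within the error budget."""
    return {
        t: value
        for t, value in measurements.items()
        if abs(value - twin_predict(t)) > ACCEPTABLE_ERROR
    }

def aggregator_reconstruct(timestamps: list[int], transmitted: dict[int, float]) -> dict[int, float]:
    """Use transmitted values where available and twin inferences for withheld samples."""
    return {t: transmitted.get(t, twin_predict(t)) for t in timestamps}

measurements = {0: 20.1, 1: 20.0, 2: 23.7, 3: 20.4}
sent = collector_reduce(measurements)                   # only t=2 exceeds the error budget
rebuilt = aggregator_reconstruct(list(measurements), sent)
print(sent)      # {2: 23.7}
print(rebuilt)   # withheld samples are filled in with twin inferences
```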
  • Publication number: 20230419131
    Abstract: Methods and systems for managing data collection are disclosed. A data aggregator may aggregate data collected by a data collector. To reduce computing resources used for aggregation, the data aggregator and data collector may use inferences provided by a twin inference model in place of data collected by the data collector rather than receiving copies of data from the data collector. Over time, the aggregated data may be revised using revised inference models that are revised using subsequently obtained data from the data collector. The revised inference models may be used to obtain revised inferences that may replace original inferences in the aggregated data. The revised inferences may be of higher accuracy due to differences in the data upon which the inference and revised inference models are based.
    Type: Application
    Filed: June 27, 2022
    Publication date: December 28, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
  • Publication number: 20230419135
    Abstract: Methods and systems for managing aggregation of data throughout a distributed environment are disclosed. To manage aggregation of data, a system may include a data aggregator and one or more data collectors. The data aggregator may obtain a threshold, the threshold indicating an acceptable error level associated with a downstream consumer of the aggregated data. The data aggregator may obtain the acceptable error level by simulating operation of the downstream consumer using synthetic data sets. The synthetic data sets may include different levels of error and, therefore, the data aggregator may determine a level of error that may impact the operation of the downstream consumer to an acceptable degree. In order to facilitate data aggregation, an inference model may be implemented that meets the threshold while consuming a minimum quantity of computing resources during operation.
    Type: Application
    Filed: June 27, 2022
    Publication date: December 28, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
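The sketch below illustrates, under stated assumptions, how an acceptable error level might be chosen by simulating a downstream consumer on synthetic data with injected error, as publication 20230419135 describes. The consumer (a simple mean), the noise model, and the drift tolerance are invented for the example.

```python
# Minimal sketch: keep the largest injected-error level whose impact on the consumer stays acceptable.
import random

def downstream_consumer(data: list[float]) -> float:
    # Assumed downstream consumer: it only cares about the mean of the aggregated data.
    return sum(data) / len(data)

def acceptable_error_level(synthetic: list[float], candidate_errors: list[float],
                           max_output_drift: float, trials: int = 200) -> float:
    baseline = downstream_consumer(synthetic)
    chosen = 0.0
    for err in sorted(candidate_errors):
        drifts = []
        for _ in range(trials):
            noisy = [x + random.uniform(-err, err) for x in synthetic]
            drifts.append(abs(downstream_consumer(noisy) - baseline))
        if max(drifts) <= max_output_drift:
            chosen = err          # consumer output still acceptable at this error level
        else:
            break
    return chosen

synthetic = [random.gauss(50.0, 5.0) for _ in range(1000)]
print(acceptable_error_level(synthetic, [0.1, 0.5, 1.0, 2.0, 5.0], max_output_drift=0.2))
```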
  • Patent number: 11848987
    Abstract: A system can divide a database into a group of shards distributed among a group of data centers, wherein the group of shards comprises respective leader replicas. The system can determine respective correlation values between pairs of shards of the group of shards. The system can examine the pairs of shards in a descending order of respective correlation values, comprising, in response to determining that a respective pair of shards of the pairs of shards has a first correlation value greater than a predetermined threshold value, and that at least one shard of the respective pair of shards is unlocked, reassigning leader replicas of the respective pair of shards to be stored in a same data center of the group of data centers, and locking the leader replicas of the respective pair of shards from being reassigned to another data center of the group of data centers during the examining.
    Type: Grant
    Filed: October 22, 2021
    Date of Patent: December 19, 2023
    Assignee: Dell Products, L.P.
    Inventors: Ofir Ezrielev, Nadav Azaria, Yonit Weiss
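A hedged sketch of the leader-placement pass claimed in patent 11848987: shard pairs are examined in descending order of correlation, their leader replicas are co-located when the correlation exceeds a threshold and at least one shard is still unlocked, and both shards are then locked for the rest of the pass. The data layout and the choice of target data center are simplifying assumptions.

```python
# Simplified, illustrative data structures; not the patented system.
THRESHOLD = 0.7
leader_dc = {"s1": "dc-a", "s2": "dc-b", "s3": "dc-c", "s4": "dc-a"}  # shard -> leader's data center
correlation = {("s1", "s2"): 0.9, ("s2", "s3"): 0.8, ("s1", "s3"): 0.75, ("s3", "s4"): 0.4}

locked: set[str] = set()
# Examine shard pairs in descending order of correlation value.
for (a, b), corr in sorted(correlation.items(), key=lambda kv: kv[1], reverse=True):
    if corr <= THRESHOLD:
        break
    if a in locked and b in locked:
        continue  # both leaders already pinned by an earlier, stronger correlation
    # Reassign both leader replicas to one data center, then lock them for the rest of the pass.
    target = leader_dc[a] if a in locked else leader_dc[b]
    leader_dc[a] = leader_dc[b] = target
    locked.update((a, b))

print(leader_dc)  # {'s1': 'dc-b', 's2': 'dc-b', 's3': 'dc-b', 's4': 'dc-a'}
```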
  • Patent number: 11816341
    Abstract: A Function as a Service (FaaS) distribution system is configured to implement FaaS as a Service (FaaSaaS), enabling autonomous storage systems to be used as FaaS providers during periods when the storage systems are not being used at full capacity to process primary workloads. The FaaS distribution system receives functions from FaaS consumers and selects a FaaS provider from a set of autonomous storage systems currently able to process FaaS workloads. The FaaS distribution system selects FaaS providers based on the expected execution time of the function and the expected execution times of other functions executing on particular FaaS providers, preferentially selecting a provider that is currently running an instance of the function and whose other currently executing functions are not expected to finish at the same time the current function is expected to complete execution.
    Type: Grant
    Filed: January 18, 2022
    Date of Patent: November 14, 2023
    Assignee: Dell Products, L.P.
    Inventors: Ofir Ezrielev, Nadav Azaria, Avitan Gefen
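The sketch below illustrates the two selection preferences described in patent 11816341, warm instances first and fewer completion collisions second. The scoring order, the collision window, and the data shapes are assumptions made for the example, not the patented algorithm.

```python
# Hypothetical provider-selection sketch under assumed data shapes.
from dataclasses import dataclass, field

@dataclass
class Provider:
    name: str
    warm_functions: set = field(default_factory=set)           # function names already instantiated
    expected_finish_times: list = field(default_factory=list)  # seconds from now, for running functions

def completion_collisions(provider: Provider, expected_runtime: float, window: float = 1.0) -> int:
    """Count running functions expected to finish near the time this function would finish."""
    return sum(1 for t in provider.expected_finish_times if abs(t - expected_runtime) < window)

def select_provider(providers: list[Provider], function_name: str, expected_runtime: float) -> Provider:
    # Prefer providers with a warm instance of the function, then those with fewer completion collisions.
    return min(
        providers,
        key=lambda p: (function_name not in p.warm_functions,
                       completion_collisions(p, expected_runtime)),
    )

providers = [
    Provider("storage-1", {"resize_image"}, [4.8, 30.0]),
    Provider("storage-2", {"resize_image", "hash_blob"}, [12.0]),
]
print(select_provider(providers, "resize_image", expected_runtime=5.0).name)  # storage-2
```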
  • Patent number: 11816134
    Abstract: Methods and systems for managing data collection in a distributed system are disclosed. The system may include a data aggregator and a data collector. The data aggregator may aggregate data collected by the data collector. To reduce the computing resources used to aggregate data, the data aggregator and data collector may implement a multi-stage data reduction process to reduce the quantity of data transmitted for data aggregation purposes. The multi-stage data reduction process may include implementing twin inference models at the aggregator and collector, identifying relationships in the data collected by the data collector, transmitting a portion of the collected data to the data aggregator based on an acceptable level of error for use of the collected data, and reconstructing the un-transmitted portion of the collected data at the aggregator. The reconstructed portion of the collected data may include the acceptable level of error.
    Type: Grant
    Filed: June 27, 2022
    Date of Patent: November 14, 2023
    Assignee: Dell Products L.P.
    Inventors: Ofir Ezrielev, Jehuda Shemer
  • Patent number: 11811862
    Abstract: Methods and systems for managing workloads are disclosed. The workloads may be supported by operation of workload components that are hosted by infrastructure. The hosted locations of the workload components by the infrastructure may impact the performance of the workloads. To manage performance of the workloads, an optimization process may be performed to identify a migration plan for migrating some of the workload components to different infrastructure locations. Some of the different infrastructure locations may reduce computing resource cost for performance of the workloads.
    Type: Grant
    Filed: April 26, 2023
    Date of Patent: November 7, 2023
    Assignee: Dell Products L.P.
    Inventors: Ofir Ezrielev, Lior Gdaliahu, Roman Bober, Yonit Lopatinski, Eliyahu Rosenes
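As a hypothetical illustration of the migration planning described in patent 11811862, the greedy sketch below moves a workload component only when the hosting-cost saving at another infrastructure location outweighs an assumed migration cost. The cost table, the migration cost, and the greedy rule are invented for the example; the patent describes an optimization process without committing to this heuristic.

```python
# Greedy, illustrative migration plan under assumed costs.
hosting_cost = {            # cost of running each workload component at each infrastructure location
    "db":  {"edge": 9.0, "core": 4.0},
    "api": {"edge": 3.0, "core": 5.0},
}
current = {"db": "edge", "api": "edge"}
MIGRATION_COST = 2.0        # assumed one-time cost, folded into the comparison

migration_plan = {}
for component, location in current.items():
    best = min(hosting_cost[component], key=hosting_cost[component].get)
    saving = hosting_cost[component][location] - hosting_cost[component][best]
    if best != location and saving > MIGRATION_COST:
        migration_plan[component] = best

print(migration_plan)       # {'db': 'core'}: only the database is worth moving
```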
  • Publication number: 20230351242
    Abstract: A system can train an artificial intelligence risk model to produce a trained model, wherein labeled training data for the training comprises respective features of users and products, and corresponding labels of respective support costs, wherein the trained model comprises a causal tree model that is configured to differentiate between first features that are immutable to an entity that utilizes the trained model and second features that are mutable to the entity. The system can, in response to applying an input to the trained model, wherein the input comprises a feature of a user and a product, produce an output that indicates a predicted support cost that corresponds to the input.
    Type: Application
    Filed: April 29, 2022
    Publication date: November 2, 2023
    Inventors: Ofir Ezrielev, Noga Gershon, Amihai Savir
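A hedged stand-in for the risk model in publication 20230351242: a plain regression tree predicting support cost from user and product features. The application describes a causal tree that distinguishes mutable from immutable features; here that distinction is only recorded as metadata, not enforced by the learner, and all feature names and values are invented.

```python
# Stand-in only: a generic regression tree, not the causal tree of the application.
from sklearn.tree import DecisionTreeRegressor

IMMUTABLE_FEATURES = ["customer_region", "product_line"]    # the entity cannot change these
MUTABLE_FEATURES = ["support_tier", "firmware_version"]     # the entity can act on these

# Toy rows ordered as [customer_region, product_line, support_tier, firmware_version],
# with the observed support cost as the label.
X = [[0, 1, 2, 3], [1, 0, 1, 2], [0, 0, 0, 1], [1, 1, 2, 0]]
y = [120.0, 80.0, 40.0, 150.0]

model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
print(model.predict([[0, 1, 1, 2]]))    # predicted support cost for a new user/product pair
```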
  • Publication number: 20230352121
    Abstract: One example method includes encoding data as a polysaccharide structure, synthesizing the polysaccharide structure to create polysaccharide storage media that comprises the data, and storing the polysaccharide storage media. The example method may also include compressing the polysaccharide and storing the compressed data as a polysaccharide.
    Type: Application
    Filed: April 27, 2022
    Publication date: November 2, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
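A purely illustrative sketch of the encoding step in publication 20230352121: each pair of bits is mapped to one of four monosaccharide symbols, so a byte string becomes a chain that could, in principle, be synthesized as a polysaccharide. The symbol alphabet and the 2-bit mapping are assumptions; the synthesis, compression, and physical storage covered by the application are outside what software can show.

```python
# Hypothetical 2-bit-per-monosaccharide encoding; alphabet chosen arbitrarily.
SUGARS = ["Glc", "Gal", "Man", "Fuc"]

def encode(data: bytes) -> list[str]:
    chain = []
    for byte in data:
        for shift in (6, 4, 2, 0):             # four 2-bit groups per byte, most significant first
            chain.append(SUGARS[(byte >> shift) & 0b11])
    return chain

def decode(chain: list[str]) -> bytes:
    values = [SUGARS.index(s) for s in chain]
    return bytes(
        (values[i] << 6) | (values[i + 1] << 4) | (values[i + 2] << 2) | values[i + 3]
        for i in range(0, len(values), 4)
    )

chain = encode(b"OK")
print(chain, decode(chain))                    # ['Gal', 'Glc', 'Fuc', 'Fuc', ...] b'OK'
```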
  • Publication number: 20230342631
    Abstract: Methods and systems for managing data collection are disclosed. To manage data collection, a system may include a data aggregator and a data collector. The data aggregator may utilize complex inference models to predict the future operation of the data collector, while the data collector may host simpler inference models. The data collector may access inferences from the complex models by obtaining a difference between complex and simple inferences from the data aggregator and locally reconstructing the complex inferences. To reduce data transmission, the data collector may transmit a data difference (e.g., a reduced-size representation of a measurement) to the data aggregator using the reconstructed complex inferences. The data aggregator may reconstruct data from the data collectors using the data difference from the data collector and inferences from the complex inference model.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
  • Publication number: 20230342216
    Abstract: Methods and systems for managing generalization of inference models throughout a distributed environment are disclosed. To manage generalization of inference models, a system may include a data aggregator and one or more data collectors. The data aggregator may obtain a similarity graph in order to determine the relationships between data obtained by the data collectors. The similarity graph may be used to obtain a grouping of the data collectors. The data aggregator may train inference models to facilitate data collection by the data collectors included in the grouping.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
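A minimal sketch of the grouping step in publication 20230342216, assuming similarity is measured with Pearson correlation and groups are the connected components of the similarity graph. Both of those choices, and the threshold, are assumptions; one inference model would then be trained per group.

```python
# Illustrative similarity-graph grouping; requires Python 3.10+ for statistics.correlation.
from statistics import correlation

collector_data = {
    "c1": [1.0, 2.0, 3.0, 4.0],
    "c2": [1.1, 2.1, 2.9, 4.2],
    "c3": [4.0, 3.0, 2.0, 1.0],
}
THRESHOLD = 0.9

# Build the similarity graph: an edge wherever two collectors' data is highly correlated.
names = list(collector_data)
edges = {n: set() for n in names}
for i, a in enumerate(names):
    for b in names[i + 1:]:
        if correlation(collector_data[a], collector_data[b]) >= THRESHOLD:
            edges[a].add(b)
            edges[b].add(a)

# Groups are the connected components; one inference model would be trained per group.
groups, seen = [], set()
for start in names:
    if start in seen:
        continue
    stack, component = [start], set()
    while stack:
        node = stack.pop()
        if node not in component:
            component.add(node)
            stack.extend(edges[node] - component)
    seen |= component
    groups.append(sorted(component))

print(groups)   # [['c1', 'c2'], ['c3']]
```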
  • Publication number: 20230342639
    Abstract: Methods and systems for managing data collection are disclosed. To manage data collection, a system may include a data aggregator. The data aggregator may utilize inference models to predict the future operation of data collectors. To validate these inferences, the data aggregator may compare a data statistic (a reduced-size representation of a series of measurements) to a complementary data statistic based on a set of inferences. If the complementary data statistic is determined accurate, the data aggregator may store the inferences as validated data and operate as though it has access to the measurements from the data collector. By doing so, the system may be able to transmit less data, consume less network bandwidth, and consume less energy throughout a distributed system.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
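A hypothetical sketch of the validation step in publication 20230342639: the collector sends only a small data statistic (here, the mean), and the aggregator checks it against the same statistic computed from its own inferences before treating those inferences as validated data. The model, the statistic, and the tolerance are assumptions.

```python
# Illustrative statistic-based validation; not the patented system.
TOLERANCE = 0.25  # assumed acceptable gap between the two statistics

def inference_model(timestamps: list[int]) -> list[float]:
    # Stand-in for the aggregator's predictions of the collector's measurements.
    return [20.0 + 0.1 * t for t in timestamps]

def collector_statistic(measurements: list[float]) -> float:
    return sum(measurements) / len(measurements)   # reduced-size representation of the series

def aggregator_validate(stat_from_collector: float, timestamps: list[int]) -> bool:
    inferred = inference_model(timestamps)
    complementary = sum(inferred) / len(inferred)
    return abs(complementary - stat_from_collector) <= TOLERANCE

timestamps = list(range(10))
measurements = [20.05 + 0.1 * t for t in timestamps]     # what the collector actually saw
stat = collector_statistic(measurements)                 # the only value transmitted
print(aggregator_validate(stat, timestamps))             # True: inferences accepted as validated data
```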
  • Publication number: 20230342641
    Abstract: Methods and systems for managing data collection are disclosed. To manage data collection, a system may include a data aggregator and a data collector. The data aggregator and data collector may utilize identical copies of a twin inference model to predict the future operation of the data collector. To minimize data transmission, the data collector may transmit a difference to the data aggregator. The data aggregator may reconstruct data from the data collectors using the difference from the data collector, and an inference generated by the copy of the twin inference model hosted by the data aggregator.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
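A minimal sketch, under assumed data shapes, of the difference-based transmission in publication 20230342641: both sides hold identical copies of a twin model, so only the (typically small) residual travels over the network.

```python
# Illustrative twin-model difference transmission.
def twin_predict(t: int) -> float:
    # Identical copies of this model run on the collector and the aggregator.
    return 100.0 + 2.0 * t

def collector_encode(t: int, measurement: float) -> float:
    return measurement - twin_predict(t)     # usually a small residual

def aggregator_decode(t: int, difference: float) -> float:
    return twin_predict(t) + difference      # reconstruction of the measurement

t, measurement = 7, 114.3
difference = collector_encode(t, measurement)
print(f"sent {difference:+.3f} instead of {measurement}")       # sent +0.300 instead of 114.3
print(f"reconstructed {aggregator_decode(t, difference):.3f}")  # reconstructed 114.300
```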
  • Publication number: 20230342429
    Abstract: Methods and systems for managing data collection are disclosed. To manage data collection, a system may include a data aggregator and data collectors. The data aggregator may utilize an inference model to predict the future operation of data collectors, and a pattern selection model to sample data from data collectors at a specific frequency and sequence. The pattern may specify that some data collectors are not to be sampled at various points in time. By doing so, the system may be able to transmit less data, consume less network bandwidth, and consume less energy throughout a distributed system while still providing access to aggregated data.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
  • Publication number: 20230342638
    Abstract: Methods and systems for managing data collection in a distributed system are disclosed. To manage data collection, the system may include a data aggregator and a data collector. The data aggregator may utilize an inference model to predict data based on future measurements performed by data collectors throughout a distributed system without having access to the measurements. The data collectors may be mobile, and the data aggregator may direct the data collectors to various locations. To select paths for the data collectors to follow, the aggregator may utilize the level of uncertainty in predictions, the sensitivities of downstream consumers to ranges of the data collected by the data collectors, and/or other types of information. The data aggregator may select the paths for varying goals over time.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
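A hedged sketch of the path-selection idea in publication 20230342638: send a mobile collector to the candidate location where the aggregator's prediction is least certain, weighted by how sensitive the downstream consumer is to errors at that location. The scores and the uncertainty-times-sensitivity rule are assumptions; the application notes that the selection goal can vary over time.

```python
# Illustrative waypoint selection under assumed scores.
prediction_uncertainty = {"site-a": 0.1, "site-b": 0.8, "site-c": 0.4}   # model's uncertainty estimate
downstream_sensitivity = {"site-a": 1.0, "site-b": 0.5, "site-c": 2.0}   # consumer impact per unit error

def next_waypoint(candidates: list[str]) -> str:
    # The goal can change over time; here the score is uncertainty x sensitivity.
    return max(candidates, key=lambda loc: prediction_uncertainty[loc] * downstream_sensitivity[loc])

print(next_waypoint(["site-a", "site-b", "site-c"]))   # site-c: 0.4 * 2.0 = 0.8 is the largest score
```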
  • Publication number: 20230344779
    Abstract: Methods and systems for managing data collection throughout a distributed environment are disclosed. To manage data collection, a system may include a data aggregator and a data collector. The data collector may utilize a consensus sequence to generate reduced-size data transmissions. The consensus sequence may be made up of patterns of data that occur frequently in data collected by the data collector. Therefore, data collected by the data collector may be condensed by replacing segments of the data with pointer pairs, a pointer pair being an indicator of a portion of the consensus sequence that matches a segment of the data. The data collector may transmit these pointer pairs, along with any additional segments of data, to the data aggregator instead of transmitting full data sets. The data aggregator may reconstruct data from the data collector using the reduced-size data and the consensus sequence.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 26, 2023
    Inventors: Ofir Ezrielev, Jehuda Shemer
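A hedged sketch of the pointer-pair encoding in publication 20230344779. The consensus sequence, the minimum match length, and the greedy longest-match search are simplifications chosen for illustration; segments not found in the consensus are transmitted literally.

```python
# Illustrative encode/decode over an assumed consensus sequence.
CONSENSUS = "ABCABDABCABE"   # assumed frequently occurring pattern in the collected data

def encode(data: str, min_match: int = 3):
    """Replace segments found in the consensus sequence with (offset, length) pointer pairs."""
    out, i = [], 0
    while i < len(data):
        match = None
        for length in range(len(data) - i, min_match - 1, -1):   # try the longest match first
            offset = CONSENSUS.find(data[i:i + length])
            if offset != -1:
                match = (offset, length)
                break
        if match:
            out.append(match)            # pointer pair into the consensus sequence
            i += match[1]
        else:
            out.append(data[i])          # literal symbol transmitted as-is
            i += 1
    return out

def decode(encoded) -> str:
    parts = []
    for item in encoded:
        if isinstance(item, tuple):
            offset, length = item
            parts.append(CONSENSUS[offset:offset + length])
        else:
            parts.append(item)
    return "".join(parts)

reading = "ABCABDXYABCABE"
sent = encode(reading)
print(sent)                      # [(0, 6), 'X', 'Y', (6, 6)]
print(decode(sent) == reading)   # True
```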
  • Publication number: 20230325354
    Abstract: Compressing files is disclosed. An input file to be compressed is first aligned. During or prior to aligning the input file, hyperparameters are set, determined, or configured. The hyperparameters may be set, determined, or configured to achieve a particular performance characteristic. Aligning the file includes splitting the file into sequences that can be aligned. The result is a compression matrix, where each row of the matrix corresponds to part of the file. A consensus sequence is determined from the compression matrix. Using the consensus sequence, pointer pairs are generated. Each pointer pair identifies a subsequence of the consensus sequence. The compressed file includes the pointer pairs and the consensus sequence.
    Type: Application
    Filed: April 12, 2022
    Publication date: October 12, 2023
    Inventors: Ofir Ezrielev, Ilan Buyum, Jehuda Shemer
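A simplified, hypothetical sketch of the alignment-and-consensus compression in publication 20230325354: the file is split into equal-length rows (standing in for the compression matrix), a per-column majority vote gives the consensus sequence, and each row is stored as pointer pairs over its matching runs plus literals where it deviates. Real alignment would allow gaps and shifts, and the hyperparameters (here just the row width) would be tuned as the abstract describes.

```python
# Illustrative fixed-width "alignment"; not the patented compressor.
from collections import Counter

def compress(data: str, width: int):
    rows = [data[i:i + width] for i in range(0, len(data), width)]
    columns = zip(*[r.ljust(width) for r in rows])
    consensus = "".join(Counter(col).most_common(1)[0][0] for col in columns)
    encoded_rows = []
    for row in rows:
        runs, i = [], 0
        while i < len(row):
            if row[i] == consensus[i]:
                start = i
                while i < len(row) and row[i] == consensus[i]:
                    i += 1
                runs.append((start, i - start))      # pointer pair into the consensus sequence
            else:
                runs.append(row[i])                  # literal symbol where the row deviates
                i += 1
        encoded_rows.append(runs)
    return consensus, encoded_rows

def decompress(consensus: str, encoded_rows) -> str:
    out = []
    for runs in encoded_rows:
        for item in runs:
            out.append(consensus[item[0]:item[0] + item[1]] if isinstance(item, tuple) else item)
    return "".join(out)

data = "GATTACAGATTACAGATTCCA"
consensus, rows = compress(data, width=7)
print(consensus, rows)                       # GATTACA [[(0, 7)], [(0, 7)], [(0, 4), 'C', (5, 2)]]
print(decompress(consensus, rows) == data)   # True
```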
  • Publication number: 20230325356
    Abstract: Compressing files is disclosed. An input file to be compressed is first aligned. When the file has multiple axes or dimensions, the file is aligned along each of the axes. Aligning the file includes splitting the file into sequences that can be aligned along each of the axes or dimensions. Aligning the file generates a compression tensor, where each row or dimensional space of the compression tensor corresponds to part of the file. A consensus tensor is determined from the compression tensor. Using the consensus tensor, pointer lists are generated. Each pointer list identifies a subsequence or portion of the consensus tensor. The compressed file includes the pointer lists and the consensus tensor.
    Type: Application
    Filed: April 12, 2022
    Publication date: October 12, 2023
    Inventors: Ofir Ezrielev, Ilan Buyum, Jehuda Shemer
  • Publication number: 20230325355
    Abstract: Compressing files is disclosed. An input file to be compressed is first aligned. Aligning the file includes splitting the file into sequences that can be aligned. The result is a compression matrix, where each row of the matrix corresponds to part of the file. The compression matrix may also serve as a warm start if additional compression is desired. Compression may be performed in stages, where an initial compression matrix is generated in a first stage using larger letter sizes for alignment and then a second compression stage is performed using smaller letter sizes. A consensus sequence is determined from the compression matrix. Using the consensus sequence, pointer pairs are generated. Each pointer pair identifies a subsequence of the consensus sequence. The compressed file includes the pointer pairs and the consensus sequence.
    Type: Application
    Filed: April 12, 2022
    Publication date: October 12, 2023
    Inventors: Ofir Ezrielev, Ilan Buyum, Jehuda Shemer