Patents by Inventor Satyam Sheshansh

Satyam Sheshansh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240152823
    Abstract: A method comprises receiving logistics operation order data, wherein the logistics operation order data identifies at least one logistics operation to be performed. The logistics operation order data is analyzed using one or more machine learning algorithms. Based at least in part on the analyzing, a logistics provider to perform the at least one logistics operation is predicted.
    Type: Application
    Filed: November 4, 2022
    Publication date: May 9, 2024
    Inventors: Bijan Kumar Mohanty, Satyam Sheshansh, Hung Dinh, Balaji Singh
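    Illustrative sketch (not part of the patent filing): one plausible way to frame the provider prediction in the abstract above is as multi-class classification over order attributes. The scikit-learn pipeline, every field name, and the toy data below are assumptions for illustration only.

```python
# Hypothetical sketch: predicting a logistics provider for an order as a
# multi-class classification problem. Field names and data are invented.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Toy "logistics operation order data"; real data would come from an order system.
orders = pd.DataFrame({
    "operation_type": ["delivery", "pickup", "delivery", "return", "delivery", "pickup"],
    "origin_region":  ["US-TX", "US-CA", "EU-DE", "US-TX", "EU-DE", "US-CA"],
    "weight_kg":      [12.0, 3.5, 40.0, 7.2, 55.0, 2.1],
    "expedited":      [1, 0, 0, 1, 0, 0],
})
providers = ["CarrierA", "CarrierB", "CarrierC", "CarrierA", "CarrierC", "CarrierB"]

# One-hot encode categorical order attributes, pass numeric ones through,
# then fit a classifier that maps order features to a provider label.
model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"),
          ["operation_type", "origin_region"])],
        remainder="passthrough")),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
model.fit(orders, providers)

# Predict a provider for a new logistics operation order.
new_order = pd.DataFrame([{"operation_type": "delivery", "origin_region": "US-CA",
                           "weight_kg": 20.0, "expedited": 1}])
print(model.predict(new_order)[0])
```
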
  • Patent number: 11941450
    Abstract: A system and method place an incoming workload within a data center having infrastructure elements (IEs) for execution. Instrumentation data are collected for both individual IEs in the data center, and workload instances executing on each of these IEs. These data are used to train a future load model according to machine learning techniques, especially supervised learning. Future loads, in turn, are used to train a ranking model that ranks IEs according to suitability to execute additional workloads. After receiving an incoming workload, the first model is used to predict, for each IE, the load on its computing resources if the workload were executed on that IE. The resulting predicted loads are then fed into the second model to predict the best ranking of IEs, and the workload is placed on the highest-ranked IE that is available to execute the workload.
    Type: Grant
    Filed: April 27, 2021
    Date of Patent: March 26, 2024
    Assignee: Dell Products L.P.
    Inventors: Rômulo Teixeira De Abreu Pinho, Satyam Sheshansh, Hung Dinh, Bijan Mohanty
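    Illustrative sketch (not part of the patent filing): a minimal version of the "future load model" described above, trained as a supervised regressor on instrumentation-style features of an infrastructure element plus the incoming workload's demands. The feature names, synthetic data, and choice of gradient boosting are assumptions.

```python
# Hypothetical sketch of the "future load model": supervised regression from
# instrumentation metrics of an infrastructure element (IE) plus a candidate
# workload's characteristics to the IE's predicted resource load.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Columns: current CPU %, current memory %, running workload count,
# incoming workload CPU demand, incoming workload memory demand.
X = np.column_stack([
    rng.uniform(0, 90, n),      # ie_cpu_pct
    rng.uniform(0, 90, n),      # ie_mem_pct
    rng.integers(0, 20, n),     # ie_workload_count
    rng.uniform(1, 30, n),      # wl_cpu_demand
    rng.uniform(1, 30, n),      # wl_mem_demand
])
# Synthetic target: future CPU load if the workload were placed on this IE.
y = np.clip(X[:, 0] + 0.9 * X[:, 3] + 0.1 * X[:, 2] + rng.normal(0, 3, n), 0, 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
future_load_model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("R^2 on held-out data:", round(future_load_model.score(X_test, y_test), 3))
```
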
  • Publication number: 20230368130
    Abstract: In one aspect, an example methodology implementing the disclosed techniques includes, by an order prioritization service, receiving information regarding orders that need to be fulfilled and determining, for each one of the orders, one or more relevant features from the information regarding the order, the one or more relevant features influencing prediction of an order priority. The method also includes, by the order prioritization service, predicting, using a machine learning (ML) model, a priority score for each one of the orders based on the determined one or more relevant features, and ranking the orders based on their respective priority scores.
    Type: Application
    Filed: May 16, 2022
    Publication date: November 16, 2023
    Applicant: Dell Products L.P.
    Inventors: Satyam Sheshansh, Dhiraj Balakrishnan, Balaji Singh Mahendranath Singh, Durga Ram Singh Bondili
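    Illustrative sketch (not part of the patent filing): one way to realize the score-then-rank flow in the abstract above, with a regression model producing a priority score per order and a sort producing the ranking. The features, synthetic labels, and model choice are invented.

```python
# Hypothetical sketch of an order prioritization service: a regression model
# predicts a priority score per order, and orders are ranked by that score.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 300

# Example "relevant features": days until promised delivery, order value,
# customer tier, and count of prior escalations.
X = np.column_stack([
    rng.integers(0, 30, n),        # days_to_commit
    rng.uniform(100, 50000, n),    # order_value
    rng.integers(1, 4, n),         # customer_tier
    rng.integers(0, 5, n),         # escalations
])
# Synthetic historical priority scores used as training labels.
y = (30 - X[:, 0]) * 2 + X[:, 1] / 1000 + X[:, 2] * 5 + X[:, 3] * 8

priority_model = RandomForestRegressor(n_estimators=200, random_state=1).fit(X, y)

# Score a small batch of open orders and rank them, highest priority first.
open_orders = np.array([[2, 42000, 3, 1],
                        [25, 800, 1, 0],
                        [7, 12000, 2, 3]])
scores = priority_model.predict(open_orders)
ranking = np.argsort(-scores)               # order indices, best first
for rank, idx in enumerate(ranking, start=1):
    print(f"rank {rank}: order {idx} (score {scores[idx]:.1f})")
```
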
  • Publication number: 20230119396
    Abstract: An example methodology implementing the disclosed techniques includes receiving a parts configuration specified for a product and generating a first feature vector that represents features from the product. The method also includes predicting, using a trained quote-time issue prediction module, whether the parts configuration specified for the product will or will not result in issues based on the first feature vector and, responsive to a prediction that the parts configuration specified for the product will not result in issues, accepting an order for the product. The method may further include receiving manufacturing details selected for the product, generating a second feature vector that represents features from the product and the selected manufacturing details, and predicting, using a trained manufacture-time issue prediction module, whether producing the product in accordance with the selected manufacturing details will or will not result in issues based on the second feature vector.
    Type: Application
    Filed: October 19, 2021
    Publication date: April 20, 2023
    Applicant: Dell Products L.P.
    Inventors: Bijan Kumar Mohanty, Satyam Sheshansh, Hung Dinh, Durga Ram Singh Bondili
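    Illustrative sketch (not part of the patent filing): the two-stage gating described above, with a quote-time check on the parts configuration and a manufacture-time check on the configuration plus manufacturing details. Both classifiers, the feature vectors, and the accept/flag policy below are stand-ins.

```python
# Hypothetical sketch of the two-stage check: a quote-time model screens the
# parts configuration, and accepted orders are later screened by a
# manufacture-time model that also sees the selected manufacturing details.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# First feature vector: derived from the parts configuration only.
X_quote = rng.uniform(size=(200, 6))
y_quote = (X_quote[:, 0] + X_quote[:, 3] > 1.2).astype(int)   # 1 = will cause issues
quote_time_model = LogisticRegression().fit(X_quote, y_quote)

# Second feature vector: parts configuration plus manufacturing details.
X_mfg = rng.uniform(size=(200, 10))
y_mfg = (X_mfg[:, 1] + X_mfg[:, 7] > 1.3).astype(int)
manufacture_time_model = LogisticRegression().fit(X_mfg, y_mfg)

def process_order(config_features, mfg_features):
    """Accept the order only if the quote-time model predicts no issues,
    then flag it if the manufacture-time model predicts production issues."""
    if quote_time_model.predict(config_features.reshape(1, -1))[0] == 1:
        return "rejected at quote time"
    if manufacture_time_model.predict(mfg_features.reshape(1, -1))[0] == 1:
        return "accepted, but flagged for manufacturing review"
    return "accepted"

print(process_order(rng.uniform(size=6), rng.uniform(size=10)))
```
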
  • Publication number: 20220391832
    Abstract: In one aspect, an example methodology implementing the disclosed techniques includes receiving a corpus of historical order fulfillment data regarding a plurality of completed orders for one or more products, the historical order fulfillment data including an actual delivery time for each product in a completed order, and identifying, from the corpus of historical order fulfillment data, a plurality of features for a product, the plurality of features correlated with an actual delivery time for the product. The method also includes generating a training dataset using the identified plurality of features, the training dataset including a plurality of training samples, each training sample of the plurality of training samples corresponding to a product and including one or more identified features and the actual delivery time for the product. The method may include training the delivery time prediction module using the plurality of training samples.
    Type: Application
    Filed: July 23, 2021
    Publication date: December 8, 2022
    Applicant: Dell Products L.P.
    Inventors: Bijan Mohanty, Hung Dinh, Satyam Sheshansh, Durga Ram Singh Bondili
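    Illustrative sketch (not part of the patent filing): building a training dataset from historical fulfillment records by keeping features correlated with actual delivery time, then fitting a delivery time regressor. The column names, correlation threshold, and model choice are assumptions.

```python
# Hypothetical sketch: select features correlated with actual delivery time
# from a toy corpus of completed orders, then train a delivery time model.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)
n = 400

# Toy corpus of completed orders with an actual delivery time per product.
history = pd.DataFrame({
    "parts_count":       rng.integers(1, 50, n),
    "build_complexity":  rng.uniform(0, 1, n),
    "ship_distance_km":  rng.uniform(10, 9000, n),
    "order_day_of_week": rng.integers(0, 7, n),        # unrelated; should be dropped
})
history["actual_delivery_days"] = (
    2 + 0.1 * history["parts_count"] + 4 * history["build_complexity"]
    + history["ship_distance_km"] / 2000 + rng.normal(0, 0.5, n))

# Keep only features whose absolute correlation with delivery time is notable.
corr = history.corr()["actual_delivery_days"].drop("actual_delivery_days")
selected = corr[corr.abs() > 0.2].index.tolist()
print("selected features:", selected)

# Training samples: selected features plus the actual delivery time as label.
X, y = history[selected], history["actual_delivery_days"]
delivery_time_model = GradientBoostingRegressor(random_state=3).fit(X, y)
print("in-sample R^2:", round(delivery_time_model.score(X, y), 3))
```
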
  • Publication number: 20220342704
    Abstract: A system and method place an incoming workload within a data center having infrastructure elements (IEs) for execution. Instrumentation data are collected for both individual IEs in the data center, and workload instances executing on each of these IEs. These data are used to train a future load model according to machine learning techniques, especially supervised learning. Future loads, in turn, are used to train a ranking model that ranks IEs according to suitability to execute additional workloads. After receiving an incoming workload, the first model is used to predict, for each IE, the load on its computing resources if the workload were executed on that IE. The resulting predicted loads are then fed into the second model to predict the best ranking of IEs, and the workload is placed on the highest-ranked IE that is available to execute the workload.
    Type: Application
    Filed: April 27, 2021
    Publication date: October 27, 2022
    Applicant: Dell Products L.P.
    Inventors: Rômulo Teixeira De Abreu Pinho, Satyam Sheshansh, Hung Dinh, Bijan Mohanty
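    Illustrative sketch (not part of the patent filing): this publication appears to be the application stage of granted patent 11941450 above, so the sketch here covers only the placement step (predict per-IE load, rank, place on the highest-ranked available IE), complementing the training sketch after that entry. Both "models" below are trivial stand-ins, not the patent's actual models.

```python
# Hypothetical sketch of the placement step: for each infrastructure element
# (IE), predict the future load if the incoming workload ran there, rank the
# IEs by predicted load, and place the workload on the best IE that is
# available to take it.
from dataclasses import dataclass

@dataclass
class IE:
    name: str
    cpu_pct: float        # current instrumented CPU load
    available: bool

def predict_future_load(ie: IE, workload_cpu_demand: float) -> float:
    """Stand-in for the trained future load model."""
    return ie.cpu_pct + workload_cpu_demand

def rank_ies(predicted_loads: dict) -> list:
    """Stand-in for the ranking model: lower predicted load ranks higher."""
    return sorted(predicted_loads, key=predicted_loads.get)

def place(workload_cpu_demand: float, ies: list) -> str:
    loads = {ie.name: predict_future_load(ie, workload_cpu_demand) for ie in ies}
    availability = {ie.name: ie.available for ie in ies}
    for name in rank_ies(loads):
        if availability[name]:           # highest-ranked IE that can take it
            return name
    raise RuntimeError("no available infrastructure element")

ies = [IE("ie-1", 70.0, True), IE("ie-2", 35.0, False), IE("ie-3", 40.0, True)]
print(place(workload_cpu_demand=15.0, ies=ies))   # -> "ie-3"
```
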