Patents by Inventor Milad Olia Hashemi

Milad Olia Hashemi is named as an inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Brief, illustrative code sketches of the main techniques appear after the listing.

  • Publication number: 20240135492
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for processing an input image using a super-resolution neural network to generate an up-sampled image that is a higher resolution version of the input image. In one aspect, a method comprises: processing the input image using an encoder subnetwork of the super-resolution neural network to generate a feature map; generating an updated feature map, comprising, for each spatial position in the updated feature map: applying a convolutional filter to the feature map to generate a plurality of features corresponding to the spatial position in the updated feature map, wherein the convolutional filter is parametrized by a set of convolutional filter parameters that are generated by processing data representing the spatial position using a hyper neural network; and processing the updated feature map using a projection subnetwork of the super-resolution neural network to generate the up-sampled image.
    Type: Application
    Filed: October 12, 2023
    Publication date: April 25, 2024
    Inventors: Cristina Nader Vasconcelos, Ahmet Cengiz Oztireli, Andrea Tagliasacchi, Kevin Jordan Swersky, Mark Jeffrey Matthews, Milad Olia Hashemi
  • Publication number: 20240086281
    Abstract: Methods, systems and apparatus, including computer programs encoded on computer storage medium, for predicting a likelihood of a future computer memory failure. In one aspect training data inputs are obtained, where each training data input includes correctable memory error data that describes correctable errors that occurred in a computer memory and data indicating whether the correctable errors produced a failure of the computer memory. For each training data input, image representations of the correctable memory error data included in the training data input are generated. The image representations are processed using a machine learning model to output an estimated likelihood of a future failure of the computer memory. A difference between the estimated likelihood of the future failure of the computer memory and the data indicating whether the correctable errors produced a failure of the computer memory is computed. Values of model parameters are updated using the computed difference.
    Type: Application
    Filed: November 13, 2023
    Publication date: March 14, 2024
    Inventors: Gufeng Zhang, Milad Olia Hashemi, Ashish V. Naik
  • Patent number: 11853161
    Abstract: Methods, systems and apparatus, including computer programs encoded on computer storage medium, for predicting a likelihood of a future computer memory failure. In one aspect training data inputs are obtained, where each training data input includes correctable memory error data that describes correctable errors that occurred in a computer memory and data indicating whether the correctable errors produced a failure of the computer memory. For each training data input, image representations of the correctable memory error data included in the training data input are generated. The image representations are processed using a machine learning model to output an estimated likelihood of a future failure of the computer memory. A difference between the estimated likelihood of the future failure of the computer memory and the data indicating whether the correctable errors produced a failure of the computer memory is computed. Values of model parameters are updated using the computed difference.
    Type: Grant
    Filed: April 22, 2022
    Date of Patent: December 26, 2023
    Assignee: Google LLC
    Inventors: Gufeng Zhang, Milad Olia Hashemi, Ashish V. Naik
  • Publication number: 20230342245
    Abstract: Methods, systems and apparatus, including computer programs encoded on computer storage medium, for predicting a likelihood of a future computer memory failure. In one aspect training data inputs are obtained, where each training data input includes correctable memory error data that describes correctable errors that occurred in a computer memory and data indicating whether the correctable errors produced a failure of the computer memory. For each training data input, image representations of the correctable memory error data included in the training data input are generated. The image representations are processed using a machine learning model to output an estimated likelihood of a future failure of the computer memory. A difference between the estimated likelihood of the future failure of the computer memory and the data indicating whether the correctable errors produced a failure of the computer memory is computed. Values of model parameters are updated using the computed difference.
    Type: Application
    Filed: April 22, 2022
    Publication date: October 26, 2023
    Inventors: Gufeng Zhang, Milad Olia Hashemi, Ashish V. Naik
  • Publication number: 20230033000
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, relating to multi-task recurrent neural networks. One of the methods includes maintaining data specifying, for a recurrent neural network, a separate internal state for each of a plurality of memory regions; receiving a current input; identifying a particular memory region of the memory access address defined by the current input; selecting, from the internal states specified in the maintained data, the internal state for the particular memory region; processing, in accordance with the selected internal state for the particular memory region, the current input in the sequence of inputs using the recurrent neural network to: generate an output, the output defining a probability distribution of a predicted memory access address, and update the selected internal state of the particular memory region; and associating the updated selected internal state with the particular memory region in the maintained data.
    Type: Application
    Filed: August 15, 2022
    Publication date: February 2, 2023
    Inventors: Milad Olia Hashemi, Jamie Alexander Smith, Kevin Jordan Swersky
  • Patent number: 11416733
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, relating to multi-task recurrent neural networks. One of the methods includes maintaining data specifying, for a recurrent neural network, a separate internal state for each of a plurality of memory regions; receiving a current input; identifying a particular memory region of the memory access address defined by the current input; selecting, from the internal states specified in the maintained data, the internal state for the particular memory region; processing, in accordance with the selected internal state for the particular memory region, the current input in the sequence of inputs using the recurrent neural network to: generate an output, the output defining a probability distribution of a predicted memory access address, and update the selected internal state of the particular memory region; and associating the updated selected internal state with the particular memory region in the maintained data.
    Type: Grant
    Filed: January 30, 2019
    Date of Patent: August 16, 2022
    Assignee: Google LLC
    Inventors: Milad Olia Hashemi, Jamie Alexander Smith, Kevin Jordan Swersky
  • Patent number: 11275744
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for disaggregating latent causes for computer system optimization. In one aspect, a method includes accessing a data stream for data values resulting from operations performed by a computer system; providing the data values as input to a data disaggregation machine learning model that generates descriptors of latent causes of the data values; providing the data values and the descriptors of the latent causes of the data values as inputs to a control system model that generates embedded representations of commands to modify the operations performed by the computer system; determining commands to modify the operations performed by the computer system based on the embedded representations of commands to modify the operations performed by the computer system; and providing the commands to the computer system.
    Type: Grant
    Filed: April 6, 2020
    Date of Patent: March 15, 2022
    Assignee: Google LLC
    Inventors: Milad Olia Hashemi, Parthasarathy Ranganathan, Harsh Satija
  • Publication number: 20200233871
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for disaggregating latent causes for computer system optimization. In one aspect, a method includes accessing a data stream for data values resulting from operations performed by a computer system; providing the data values as input to a data disaggregation machine learning model that generates descriptors of latent causes of the data values; providing the data values and the descriptors of the latent causes of the data values as inputs to a control system model that generates embedded representations of commands to modify the operations performed by the computer system; determining commands to modify the operations performed by the computer system based on the embedded representations of commands to modify the operations performed by the computer system; and providing the commands to the computer system.
    Type: Application
    Filed: April 6, 2020
    Publication date: July 23, 2020
    Inventors: Milad Olia Hashemi, Parthasarathy Ranganathan, Harsh Satija
  • Publication number: 20200160150
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, relating to multi-task recurrent neural networks. One of the methods includes maintaining data specifying, for a recurrent neural network, a separate internal state for each of a plurality of memory regions; receiving a current input; identifying a particular memory region of the memory access address defined by the current input; selecting, from the internal states specified in the maintained data, the internal state for the particular memory region; processing, in accordance with the selected internal state for the particular memory region, the current input in the sequence of inputs using the recurrent neural network to: generate an output, the output defining a probability distribution of a predicted memory access address, and update the selected internal state of the particular memory region; and associating the updated selected internal state with the particular memory region in the maintained data.
    Type: Application
    Filed: January 30, 2019
    Publication date: May 21, 2020
    Inventors: Milad Olia Hashemi, Jamie Alexander Smith, Kevin Jordan Swersky
  • Patent number: 10650001
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for disaggregating latent causes for computer system optimization. In one aspect, a method includes accessing a data stream for data values resulting from operations performed by a computer system; providing the data values as input to a data disaggregation machine learning model that generates descriptors of latent causes of the data values; providing the data values and the descriptors of the latent causes of the data values as inputs to a control system model that generates embedded representations of commands to modify the operations performed by the computer system; determining commands to modify the operations performed by the computer system based on the embedded representations of commands to modify the operations performed by the computer system; and providing the commands to the computer system.
    Type: Grant
    Filed: October 5, 2017
    Date of Patent: May 12, 2020
    Assignee: Google LLC
    Inventors: Milad Olia Hashemi, Parthasarathy Ranganathan, Harsh Satija
  • Publication number: 20190370632
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for pre-fetching data from memory using neural networks. One example system receives a sequence of prior program counter addresses of a computer program and corresponding delta values. The system creates an input representation based on the sequence. The system provides the input representation as input to a recurrent neural network. The system receives from the recurrent neural network an output that defines a probability distribution over future delta values. Each probability in the distribution represents a likelihood that execution of a future instruction of the computer program will cause data to be fetched from a particular future memory address.
    Type: Application
    Filed: May 31, 2018
    Publication date: December 5, 2019
    Inventors: Milad Olia Hashemi, Parthasarathy Ranganathan
  • Publication number: 20190108261
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for disaggregating latent causes for computer system optimization. In one aspect, a method includes accessing a data stream for data values resulting from operations performed by a computer system; providing the data values as input to a data disaggregation machine learning model that generates descriptors of latent causes of the data values; providing the data values and the descriptors of the latent causes of the data values as inputs to a control system model that generates embedded representations of commands to modify the operations performed by the computer system; determining commands to modify the operations performed by the computer system based on the embedded representations of commands to modify the operations performed by the computer system; and providing the commands to the computer system.
    Type: Application
    Filed: October 5, 2017
    Publication date: April 11, 2019
    Inventors: Milad Olia Hashemi, Parthasarathy Ranganathan, Harsh Satija
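The listing above covers several distinct techniques; the sketches that follow illustrate, under stated assumptions, how each might be wired up. They are minimal illustrations of the general ideas, not the patented implementations. The first concerns publication 20240135492: an encoder produces a feature map, a hyper network maps each spatial position to convolutional filter parameters, and a projection subnetwork emits the up-sampled image. In this sketch the spatial position is represented by normalized (x, y) coordinates and the projection uses a sub-pixel shuffle; both choices, along with the module names and layer sizes, are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperConvUpsampler(nn.Module):
    """Sketch of a coordinate-conditioned convolution for super-resolution.
    Names, sizes, and the (x, y) position encoding are illustrative assumptions."""

    def __init__(self, in_ch=3, feat_ch=32, kernel=3, scale=2):
        super().__init__()
        self.feat_ch, self.kernel, self.scale = feat_ch, kernel, scale
        # Encoder subnetwork: input image -> feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1),
        )
        # Hyper network: data representing a spatial position -> filter parameters.
        self.hyper = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(),
            nn.Linear(64, feat_ch * feat_ch * kernel * kernel),
        )
        # Projection subnetwork: updated feature map -> up-sampled image.
        self.project = nn.Conv2d(feat_ch, in_ch * scale * scale, 3, padding=1)

    def forward(self, image):
        b, _, h, w = image.shape
        feats = self.encoder(image)                                        # (b, c, h, w)
        patches = F.unfold(feats, self.kernel, padding=self.kernel // 2)   # (b, c*k*k, h*w)
        # One normalized (x, y) coordinate per spatial position of the feature map.
        ys, xs = torch.meshgrid(torch.linspace(-1, 1, h, device=image.device),
                                torch.linspace(-1, 1, w, device=image.device),
                                indexing="ij")
        coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)              # (h*w, 2)
        filters = self.hyper(coords).view(h * w, self.feat_ch, -1)         # (h*w, c, c*k*k)
        # Position-dependent convolution: a different filter at every location.
        updated = torch.einsum("bkp,pok->bop", patches, filters).view(b, self.feat_ch, h, w)
        return F.pixel_shuffle(self.project(updated), self.scale)          # (b, in_ch, h*s, w*s)

model = HyperConvUpsampler()
upsampled = model(torch.rand(1, 3, 32, 32))   # -> (1, 3, 64, 64)
```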
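For the memory-failure family (publication numbers 20240086281 and 20230342245, patent number 11853161), a minimal sketch of the training step the abstract describes: correctable-error coordinates are bucketed into a 2D count grid (the image representation), a small convolutional model estimates the likelihood of a future failure, and the difference from the observed failure label drives a parameter update. The row/column bucketing and the network shape are assumptions.

```python
import torch
import torch.nn as nn

def errors_to_image(error_events, rows=64, cols=64):
    """Hypothetical encoding: bucket correctable-error (row, column) coordinates
    into a fixed-size 2D count grid, i.e. an image representation of the errors."""
    img = torch.zeros(1, rows, cols)
    for r, c in error_events:
        img[0, r % rows, c % cols] += 1.0
    return img

class FailurePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(16, 1),
        )

    def forward(self, x):
        return self.net(x)   # logit for "this memory will fail"

model = FailurePredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Toy training input: one error pattern and whether a failure was later observed.
images = errors_to_image([(3, 7), (3, 8), (3, 9)]).unsqueeze(0)   # (1, 1, 64, 64)
labels = torch.tensor([[1.0]])

opt.zero_grad()
loss = loss_fn(model(images), labels)   # difference between estimate and label
loss.backward()
opt.step()                              # update model parameter values
```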
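For the multi-task recurrent network family (patent number 11416733, publication numbers 20230033000 and 20200160150), a sketch of the bookkeeping described in the abstract: a single recurrent cell is shared, but a separate internal state is maintained per memory region, selected by the region of the incoming access address and re-associated with that region after each step. The region hashing, vocabulary size, and choice of a GRU cell are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PerRegionPredictor(nn.Module):
    """Shared recurrent cell with one internal state per memory region.
    The address quantization and region scheme are illustrative assumptions."""

    def __init__(self, vocab=4096, emb=64, hidden=128, region_bits=12):
        super().__init__()
        self.region_bits = region_bits
        self.hidden = hidden
        self.embed = nn.Embedding(vocab, emb)   # embeds a quantized access address
        self.cell = nn.GRUCell(emb, hidden)
        self.head = nn.Linear(hidden, vocab)    # distribution over predicted addresses
        self.states = {}                        # region id -> maintained internal state

    def step(self, address):
        region = address >> self.region_bits                       # identify the memory region
        state = self.states.get(region, torch.zeros(1, self.hidden))
        x = self.embed(torch.tensor([address % 4096]))
        new_state = self.cell(x, state)                            # update the selected state
        self.states[region] = new_state.detach()                   # associate it with the region
        return F.softmax(self.head(new_state), dim=-1)             # predicted-address distribution

predictor = PerRegionPredictor()
for addr in [0x1000, 0x1040, 0x1080, 0x20000]:
    probs = predictor.step(addr)   # the last access falls in a different region,
                                   # so it starts from that region's own state
```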
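For the latent-cause family (patent numbers 11275744 and 10650001, publication numbers 20200233871 and 20190108261), the abstract describes a two-model pipeline: a disaggregation model turns a stream of data values into descriptors of latent causes, and a control model turns the values plus descriptors into an embedded command representation, from which a concrete command is determined and sent to the system. The sketch below shows only that data flow; the command vocabulary, nearest-embedding decoding, and layer sizes are invented for illustration.

```python
import torch
import torch.nn as nn

class Disaggregator(nn.Module):
    """Maps a window of observed data values to descriptors of latent causes."""
    def __init__(self, window=16, latents=4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(window, 64), nn.ReLU(), nn.Linear(64, latents))
    def forward(self, values):
        return self.net(values)

class ControlModel(nn.Module):
    """Maps data values plus latent-cause descriptors to an embedded command."""
    def __init__(self, window=16, latents=4, cmd_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(window + latents, 64), nn.ReLU(), nn.Linear(64, cmd_dim))
    def forward(self, values, descriptors):
        return self.net(torch.cat([values, descriptors], dim=-1))

# Hypothetical command vocabulary; each tuning command gets a fixed embedding.
commands = ["raise_prefetch_degree", "lower_prefetch_degree", "flush_cache_partition"]
command_embeddings = torch.randn(len(commands), 8)

def choose_command(values, disaggregator, controller):
    descriptors = disaggregator(values)              # descriptors of latent causes
    cmd_embedding = controller(values, descriptors)  # embedded command representation
    # Determine the concrete command whose embedding is closest to the model output.
    dists = ((command_embeddings - cmd_embedding) ** 2).sum(dim=-1)
    return commands[dists.argmin().item()]

values = torch.randn(16)   # e.g. a window of latency samples from the data stream
print(choose_command(values, Disaggregator(), ControlModel()))
```

With untrained models this only demonstrates the shape of the pipeline; in practice both models would be trained before commands are issued to the system.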
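For publication 20190370632, a sketch of the delta-based prefetching formulation: program-counter and address-delta sequences are embedded, a recurrent network consumes the sequence, and the output is a probability distribution over future delta values, where each delta added to the last observed address yields a candidate prefetch address. Vocabulary sizes and the LSTM choice are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeltaPrefetcher(nn.Module):
    """Embed (program counter, delta) pairs, run an LSTM over the sequence,
    and output a distribution over a vocabulary of frequently observed deltas.
    All sizes are illustrative assumptions."""

    def __init__(self, pc_vocab=1024, delta_vocab=512, emb=32, hidden=128):
        super().__init__()
        self.pc_embed = nn.Embedding(pc_vocab, emb)
        self.delta_embed = nn.Embedding(delta_vocab, emb)
        self.rnn = nn.LSTM(2 * emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, delta_vocab)

    def forward(self, pcs, deltas):
        # pcs, deltas: (batch, seq_len) integer ids for program counters and
        # address deltas (current address minus previous address, quantized).
        x = torch.cat([self.pc_embed(pcs), self.delta_embed(deltas)], dim=-1)
        out, _ = self.rnn(x)
        logits = self.head(out[:, -1])        # predict the next delta
        return F.softmax(logits, dim=-1)      # distribution over future delta values

model = DeltaPrefetcher()
pcs = torch.randint(0, 1024, (1, 8))
deltas = torch.randint(0, 512, (1, 8))
probs = model(pcs, deltas)   # each probability corresponds to one candidate
                             # future address: last address + that delta
```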