Patents by Inventor Sathish Terakanambi Sheshadri

Sathish Terakanambi Sheshadri has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11586922
    Abstract: A method for selectively dropping out feature elements from a tensor in a neural network includes receiving a first tensor from a first layer of a neural network and obtaining a compressed mask for the first tensor. N mask bits of the compressed mask are received at each of N lanes of a reconfigurable computing unit, and feature elements of the first tensor are respectively received at the N lanes. Feature elements are selectively dropped out from the first tensor to generate feature elements to use as at least part of a second tensor by selecting, based on a single mask bit of the compressed mask that is chosen according to the lane, either a zero value or the feature element received at the lane as a feature element of the second tensor. The second tensor is propagated to a second layer of the neural network.
    Type: Grant
    Filed: February 4, 2022
    Date of Patent: February 21, 2023
    Assignee: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
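    The per-lane selection described in the abstract above can be sketched in plain Python. This is a minimal software model, not the patented reconfigurable-hardware implementation; the function name `lane_dropout` and the round-robin lane assignment are illustrative assumptions.

    ```python
    def lane_dropout(features, mask_bits, n_lanes):
        """Model of per-lane dropout: each lane receives a feature element
        and a single compressed-mask bit, and emits either a zero value or
        the feature element it received."""
        out = []
        for i, (feat, bit) in enumerate(zip(features, mask_bits)):
            lane = i % n_lanes  # lane that receives this element/bit pair (round-robin assumption)
            out.append(feat if bit else 0.0)  # select the element or zero per the lane's mask bit
        return out
    ```

    In hardware, the N lanes would perform these selections in parallel; the loop here only serializes that behavior for clarity.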
  • Patent number: 11580397
    Abstract: A method for selectively dropping out feature elements from a tensor in a neural network is disclosed. The method includes receiving a first tensor from a first layer of a neural network. The first tensor includes multiple feature elements arranged in a first order. A compressed mask for the first tensor is obtained. The compressed mask includes single-bit mask elements that respectively correspond to the multiple feature elements of the first tensor and are arranged in a second order that is different from the first order of the corresponding feature elements in the first tensor. Feature elements from the first tensor are selectively dropped out based on the compressed mask to form a second tensor, which is propagated to a second layer of the neural network.
    Type: Grant
    Filed: February 4, 2022
    Date of Patent: February 14, 2023
    Assignee: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
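    The key idea in this abstract — single-bit mask elements stored in a different order than the feature elements they govern — can be modeled with an explicit permutation. This sketch assumes the reordering is given as an index array `order`; the real patent does not specify this representation, and the names here are hypothetical.

    ```python
    def apply_reordered_mask(features, mask_bits, order):
        """Apply a compressed mask whose bit order differs from the feature
        order: mask_bits[k] corresponds to features[order[k]]."""
        out = list(features)
        for k, bit in enumerate(mask_bits):
            idx = order[k]      # feature element this mask bit governs
            if not bit:
                out[idx] = 0.0  # drop the corresponding feature element
        return out
    ```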
  • Publication number: 20220391694
    Abstract: A method for selectively dropping out feature elements from a tensor in a neural network is disclosed. The method includes receiving a first tensor from a first layer of a neural network. The first tensor includes multiple feature elements arranged in a first order. A compressed mask for the first tensor is obtained. The compressed mask includes single-bit mask elements that respectively correspond to the multiple feature elements of the first tensor and are arranged in a second order that is different from the first order of the corresponding feature elements in the first tensor. Feature elements from the first tensor are selectively dropped out based on the compressed mask to form a second tensor, which is propagated to a second layer of the neural network.
    Type: Application
    Filed: February 4, 2022
    Publication date: December 8, 2022
    Applicant: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
  • Publication number: 20220391695
    Abstract: A method for selectively dropping out feature elements from a tensor in a neural network includes receiving a first tensor from a first layer of a neural network and obtaining a compressed mask for the first tensor. N mask bits of the compressed mask are received at each of N lanes of a reconfigurable computing unit, and feature elements of the first tensor are respectively received at the N lanes. Feature elements are selectively dropped out from the first tensor to generate feature elements to use as at least part of a second tensor by selecting, based on a single mask bit of the compressed mask that is chosen according to the lane, either a zero value or the feature element received at the lane as a feature element of the second tensor. The second tensor is propagated to a second layer of the neural network.
    Type: Application
    Filed: February 4, 2022
    Publication date: December 8, 2022
    Applicant: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
  • Patent number: 11328209
    Abstract: A method for selectively dropping out feature elements from a tensor is disclosed. The method includes generating a mask that has a plurality of mask elements. Each mask element includes a corresponding plurality of bits representing either a first value or a second value, to indicate whether a corresponding feature element of the tensor output by a neural network layer is to be dropped out or retained. Each mask element of the plurality of mask elements of the mask is compressed to generate a corresponding compressed mask element of a plurality of compressed mask elements of a compressed mask, thereby generating the compressed mask from the mask. Each compressed mask element of the plurality of compressed mask elements includes a corresponding single bit. Feature elements are selectively dropped from the tensor, based on the compressed mask.
    Type: Grant
    Filed: June 2, 2021
    Date of Patent: May 10, 2022
    Assignee: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
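    The compression step described above — collapsing each multi-bit mask element into a single bit — can be sketched as follows. This is an illustrative model only: the choice of full-width integer mask elements and of packing eight bits per byte is an assumption, not something specified by the abstract.

    ```python
    def compress_mask(mask_elements):
        """Compress a mask whose elements are full-width values (zero = drop,
        nonzero = retain) into a packed bitstring with one bit per element."""
        bits = [1 if m else 0 for m in mask_elements]  # one bit per mask element
        packed = bytearray()
        for i in range(0, len(bits), 8):
            byte = 0
            for j, b in enumerate(bits[i:i + 8]):
                byte |= b << j  # pack up to 8 single-bit elements per byte
            packed.append(byte)
        return packed
    ```

    For 16-bit mask elements this yields a 16x reduction in mask storage, which is the practical motivation for compressing before the selective drop.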
  • Patent number: 11256987
    Abstract: A method for selectively dropping out feature elements from a tensor is disclosed. The method includes generating a mask that has a plurality of mask elements arranged in a first order. A compressed mask is generated, which includes a plurality of compressed mask elements arranged in a second order that is different from the first order. Specifically, each mask element of the plurality of mask elements of the mask is compressed to generate a corresponding compressed mask element of the plurality of compressed mask elements of the compressed mask. Each compressed mask element of the plurality of compressed mask elements indicates whether a corresponding feature element of the tensor output by a neural network layer is to be dropped out or retained. Feature elements are selectively dropped from the tensor, based on the compressed mask.
    Type: Grant
    Filed: June 2, 2021
    Date of Patent: February 22, 2022
    Assignee: SambaNova Systems, Inc.
    Inventors: Sathish Terakanambi Sheshadri, Ram Sivaramakrishnan, Raghu Prabhakar
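    This last abstract combines compression with reordering: the compressed mask elements come out in a second order different from the first. One plausible reordering, sketched here purely as an assumption, is dealing the bits round-robin across lanes so each lane's bits end up contiguous; the abstract does not specify the actual ordering used.

    ```python
    def compress_and_reorder(mask_elements, n_lanes):
        """Compress full-width mask elements to single bits, then emit them in
        a second order: all bits for lane 0 first, then lane 1, and so on
        (assuming round-robin lane assignment of the original elements)."""
        bits = [1 if m else 0 for m in mask_elements]  # compression step
        reordered = []
        for lane in range(n_lanes):
            reordered.extend(bits[lane::n_lanes])  # gather this lane's bits
        return reordered
    ```

    A consumer that knows `n_lanes` can invert this ordering when applying the mask, so no extra index data needs to be stored alongside the bits.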