Patents by Inventor Poorna Kale

Poorna Kale has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220156560
    Abstract: Apparatuses and methods can be related to compiling instructions for implementing an artificial neural network (ANN) bypass. The bypass path can be used to bypass a portion of the ANN such that the ANN generates an output with a particular level of confidence while utilizing fewer resources than if the portion of the ANN had not been bypassed. A compiler can determine where to place the bypass path in an ANN.
    Type: Application
    Filed: November 18, 2020
    Publication date: May 19, 2022
    Inventors: Saideep Tiku, Poorna Kale
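The bypass idea in the entry above, where an early-exit path lets the network skip later layers once an output reaches sufficient confidence, can be sketched roughly as follows. This is a minimal illustration, not the patented method: `run_with_bypass`, the `tanh` layers, and the midpoint bypass placement are all illustrative assumptions (in the abstract, the placement is chosen by a compiler).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def run_with_bypass(layers, exit_head, x, threshold=0.9):
    """Run a feed-forward network with an early-exit (bypass) path.

    The bypass point is placed at the network midpoint here for
    illustration. When the exit head's confidence clears the
    threshold, the remaining layers are skipped."""
    bypass_at = len(layers) // 2  # illustrative compiler-chosen placement
    for i, w in enumerate(layers):
        x = np.tanh(x @ w)
        if i == bypass_at - 1:
            probs = softmax(x @ exit_head)
            if probs.max() >= threshold:
                return probs, i + 1  # number of layers actually executed
    return softmax(x @ exit_head), len(layers)
```

Lowering the threshold trades accuracy for resource savings, since more inputs exit at the bypass point.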
  • Publication number: 20220129677
    Abstract: Systems, devices, and methods related to video analysis using an Artificial Neural Network (ANN) are described. For example, a data storage device can be configured to perform the computation of an ANN to recognize or classify features captured in the video images. The recognition or classification results of a prior video frame can be used to accelerate the analysis of the next video frame. The ANN can be organized in layers, where the intermediate result of a current layer can be further analyzed by a next layer for improved accuracy and confidence level. Before or while processing using the next layer, the intermediate result can be compared to the results obtained for the prior frame. If, in view of the results of the prior frame, the confidence level of the intermediate result is boosted to above a threshold, the subsequent layer(s) can be skipped or terminated early.
    Type: Application
    Filed: October 22, 2020
    Publication date: April 28, 2022
    Inventor: Poorna Kale
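The prior-frame acceleration described in the entry above, where an intermediate result's confidence is boosted when it agrees with the previous frame's result so that later layers can be skipped, can be sketched as follows. The function, its `(label, confidence)` representation, and the `boost`/`threshold` parameters are hypothetical stand-ins, not the claimed implementation.

```python
def classify_frame(layer_results, prior_label, threshold=0.8, boost=0.15):
    """layer_results: per-layer intermediate (label, confidence) pairs.

    If an intermediate label matches the prior frame's label, its
    confidence is boosted; once the (boosted) confidence clears the
    threshold, the remaining layers are skipped. Returns the chosen
    label and the number of layers actually evaluated."""
    for depth, (label, conf) in enumerate(layer_results, start=1):
        if label == prior_label:
            conf = min(1.0, conf + boost)  # agreement with prior frame
        if conf >= threshold:
            return label, depth            # early termination
    return layer_results[-1][0], len(layer_results)
```

With a matching prior frame, a 0.7-confidence intermediate result is boosted past the 0.8 threshold and the second layer is never evaluated.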
  • Publication number: 20220129738
    Abstract: Systems, devices, and methods related to safety monitoring in a factory using an artificial neural network are described. For example, the system can use a plurality of sensors installed at different locations of a factory to generate a plurality of streams of sensor data. At least one memory device can be configured in the system to perform matrix computations of the artificial neural network according to the plurality of streams of sensor data written into the at least one memory device. Based on an output of the artificial neural network responsive to the plurality of streams of sensor data, the system generates an event identification representative of a hazard or anomaly in the factory and activates safety control or notification responsive to the event identification.
    Type: Application
    Filed: October 22, 2020
    Publication date: April 28, 2022
    Inventor: Poorna Kale
  • Publication number: 20220128657
    Abstract: Systems, devices, and methods related to a radar and an artificial neural network are described. For example, the radar can have at least one processing unit configured to execute instructions implementing matrix computation of the artificial neural network. The artificial neural network is configured to identify, in an output responsive to an input containing a radar image, features in the radar image. Optionally, the radar can further include an image sensor to generate an optical image as part of the input to the artificial neural network. Instead of outputting the radar images and/or the optical images, the radar may output a description of the features identified via the artificial neural network from the radar image.
    Type: Application
    Filed: October 22, 2020
    Publication date: April 28, 2022
    Inventor: Poorna Kale
  • Publication number: 20220114843
    Abstract: Systems, methods and apparatuses of predictive maintenance of automotive transmission of vehicles. For example, the transmission has at least one sensor to measure a temperature in transmission fluid, the torque applied on a shaft of the transmission, a vibration sensor, and/or a microphone. During a period in which the vehicle is assumed to be operating normally, the sensor data generated by the transmission sensor(s) is used to train an artificial neural network to recognize the normal patterns in the sensor data. Subsequently, the trained artificial neural network is used to determine whether the current sensor data from the transmission sensor(s) are abnormal. A maintenance alert can be generated for the vehicle in response to a determination that the operations of the transmission are abnormal according to the artificial neural network and the current sensor data.
    Type: Application
    Filed: December 20, 2021
    Publication date: April 14, 2022
    Inventors: Poorna Kale, Robert Richard Noel Bielby
  • Patent number: 11301132
    Abstract: One or more usage parameter values are received from a host system. The one or more usage parameter values correspond to one or more operations performed at the memory sub-system. Based on the one or more usage parameter values, a first expected time period is determined during which a first set of subsequent host data will be received from the host system and a second expected time period is determined during which a second set of subsequent host data will be received from the host system. A media management operation is scheduled to be performed between the first expected time period and the second expected time period.
    Type: Grant
    Filed: August 30, 2019
    Date of Patent: April 12, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Ashok Sahoo
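The scheduling idea in the entry above, performing a media management operation in the gap between two expected host-data periods, can be sketched as follows. `schedule_media_management` and its interval representation are illustrative assumptions; the patent's scheduling is driven by usage parameters received from the host.

```python
def schedule_media_management(busy_windows, op_duration):
    """busy_windows: sorted (start, end) times when host data is
    expected to arrive. Returns the start time of the earliest gap
    between consecutive windows long enough to fit the media
    management operation, or None if no such gap exists."""
    for (_, end_a), (start_b, _) in zip(busy_windows, busy_windows[1:]):
        if start_b - end_a >= op_duration:
            return end_a  # run the operation right after the first window
    return None
```

Fitting the operation into a predicted idle gap avoids contention with incoming host writes.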
  • Patent number: 11296729
    Abstract: Systems, methods, and apparatus related to memory devices such as solid state drives. In one approach, data is received from a host system (e.g., data to be written to an SSD). The received data is encoded using a first error correction code to generate first parity data. A temperature at which memory cells of a storage device (e.g., the SSD) will store the received data is determined. In response to determining the temperature, a first portion of the received data is identified (e.g., data that is error-prone at a predicted higher temperature, where the prediction is based on output from an artificial neural network using sensor input). The identified first portion is encoded using a second error correction code to generate second parity data. The second error correction code has a higher error correction capability than the first error correction code. The encoded first portion, the first parity data, and the second parity data are stored in the memory cells of the storage device.
    Type: Grant
    Filed: July 23, 2020
    Date of Patent: April 5, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Christopher Joseph Bueb
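The two-tier encoding in the entry above, where a temperature-sensitive portion of the data receives a stronger second code on top of the base code, can be sketched with toy codes: a single parity bit standing in for the weaker first code and a 3x repetition code standing in for the stronger second code. Real SSDs would use codes such as BCH or LDPC; everything here is illustrative.

```python
def xor_parity(bits):
    """Single parity bit: a stand-in for the weaker base code."""
    p = 0
    for b in bits:
        p ^= b
    return p

def repetition3(bits):
    """3x repetition: a stand-in for the stronger second code."""
    return [b for b in bits for _ in range(3)]

def encode_two_tier(data_bits, hot_slice, strong_code=repetition3):
    """Encode all data with the base code, and additionally encode the
    temperature-sensitive (error-prone) slice with the stronger code,
    mirroring the two-parity layout described in the abstract."""
    first_parity = xor_parity(data_bits)
    second_parity = strong_code(data_bits[hot_slice])
    return data_bits, first_parity, second_parity
```

Only the error-prone slice pays the storage overhead of the stronger code, which is the point of identifying it from the predicted temperature.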
  • Publication number: 20220101119
    Abstract: Apparatuses and methods can be related to implementing age-based network training. An artificial neural network (ANN) can be trained by introducing errors into the ANN. The errors and the quantity of errors introduced into the ANN can be based on age-based characteristics of the memory device that stores the ANN.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Saideep Tiku, Poorna Kale
  • Publication number: 20220070407
    Abstract: Systems, devices, and methods related to Video Surveillance as a Service (VSaaS) are described. For example, a removable storage device, such as a secure digital (SD) memory card or a micro SD card, can be configured to run a virtual camera agent. When the removable storage device is inserted into a digital camera to provide a storage capacity for the digital camera, the agent can convert the video captured by the digital camera into video captured by a virtual camera. The virtual camera can be configured to be in compliance with the camera requirements of a VSaaS platform. Thus, a digital camera not in compliance with the platform can still be used with the platform through the deployment of the virtual camera that is enabled by the removable storage device.
    Type: Application
    Filed: August 25, 2020
    Publication date: March 3, 2022
    Inventors: Poorna Kale, Te-Chang Lin
  • Patent number: 11263156
    Abstract: A memory component can include memory cells with a memory region to store a machine learning model and input data and another memory region to store host data from a host system. The memory component can include an in-memory logic, coupled to the memory cells, to perform a machine learning operation by applying the machine learning model to the input data to generate output data. A bus can receive additional data from the host system and can provide the additional data to the other memory region or the in-memory logic based on a characteristic of the additional data.
    Type: Grant
    Filed: October 14, 2019
    Date of Patent: March 1, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Amit Gattani
  • Patent number: 11250648
    Abstract: Systems, methods and apparatuses of predictive maintenance of automotive transmission of vehicles. For example, the transmission has at least one sensor to measure a temperature in transmission fluid, the torque applied on a shaft of the transmission, a vibration sensor, and/or a microphone. During a period in which the vehicle is assumed to be operating normally, the sensor data generated by the transmission sensor(s) is used to train an artificial neural network to recognize the normal patterns in the sensor data. Subsequently, the trained artificial neural network is used to determine whether the current sensor data from the transmission sensor(s) are abnormal. A maintenance alert can be generated for the vehicle in response to a determination that the operations of the transmission are abnormal according to the artificial neural network and the current sensor data.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: February 15, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Robert Richard Noel Bielby
  • Publication number: 20220044102
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory (RAM) to store parameters of an artificial neural network (ANN). The device can generate random bit errors to simulate compromised or corrupted memory cells in a portion of the RAM accessed during computations of a first ANN output. A second ANN output is generated with the random bit errors applied to the data retrieved from the portion of the RAM. Based on a difference between the first and second ANN outputs, the device may adjust the ANN computation to reduce sensitivity to compromised or corrupted memory cells in the portion of the RAM. For example, the sensitivity reduction may be performed through ANN training using machine learning.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
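The error-injection step in the entry above, flipping random bits in stored parameters and comparing the resulting ANN outputs to gauge sensitivity to compromised memory cells, can be sketched as follows. The single-layer `tanh` network and the `output_sensitivity` metric are illustrative assumptions, not the patented mechanism.

```python
import numpy as np

def flip_random_bits(arr, n_flips, rng):
    """Simulate compromised memory cells by flipping random bits in
    the byte representation of a parameter array."""
    buf = bytearray(arr.tobytes())
    for _ in range(n_flips):
        i = int(rng.integers(len(buf)))
        buf[i] ^= 1 << int(rng.integers(8))
    out = np.frombuffer(bytes(buf), dtype=arr.dtype).reshape(arr.shape)
    return np.nan_to_num(out)  # keep corrupted floats finite

def output_sensitivity(weights, x, n_flips=4, seed=0):
    """Largest change in a single tanh layer's output when random bit
    errors are applied to its stored weights."""
    rng = np.random.default_rng(seed)
    clean = np.tanh(x @ weights)
    noisy = np.tanh(x @ flip_random_bits(weights, n_flips, rng))
    return float(np.abs(clean - noisy).max())
```

A large sensitivity for a given portion of memory would, per the abstract, motivate adjusting the computation (e.g., by retraining) to tolerate errors in that region.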
  • Publication number: 20220044108
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory. The random access memory is configured to store an image generated in an imaging apparatus configured to image a portion of a person, parameters of an artificial neural network, and instructions executable by the Deep Learning Accelerator to perform matrix computation to generate an output of the artificial neural network. The output can include a feature identified by the artificial neural network and a diagnosis determined by the artificial neural network to assist or guide the imaging of the portion of the person.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
  • Publication number: 20220043502
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory that includes multiple memory groups having independent power modes. The random access memory is configured to store data representative of parameters of an Artificial Neural Network and representative of instructions executable by the Deep Learning Accelerator to perform matrix computation to generate an output of the Artificial Neural Network. During execution of the instructions, a power manager may adjust grouping of memory addresses mapped into the memory groups and adjust power modes of the memory groups to reduce power consumption and to avoid performance impact.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
  • Publication number: 20220044101
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory. The random access memory is configured to store input data from a sensor, parameters of a first portion of an Artificial Neural Network (ANN), instructions executable by the Deep Learning Accelerator to perform matrix computation of the first portion of the ANN, and data generated outside of the device according to a second portion of the ANN. The Deep Learning Accelerator may execute the instructions to generate, independent of the data from the second portion of the ANN, a first output based on the input data from the sensor and generate a second output based on a combination of the data from the sensor and the data from the second portion of the ANN.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
  • Publication number: 20220044107
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory. The random access memory is configured to store a plurality of inputs received respectively from a plurality of sensors, parameters of an Artificial Neural Network, and instructions executable by the Deep Learning Accelerator to perform matrix computation to generate outputs of the Artificial Neural Network, including first outputs generated using the sensors separately and a second output generated using a combination of the sensors.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
  • Publication number: 20220043696
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory. At least one interface of the integrated circuit device is configured to receive input data from a data source, and to receive, from a server system over a computer network, parameters of a first Artificial Neural Network (ANN) and instructions executable by the Deep Learning Accelerator to perform matrix computation of the first ANN. The Deep Learning Accelerator may execute the instructions to generate an output of the first ANN responsive to the input data; and the at least one interface is configured to transmit the output to the server system over the computer network as an input to a second ANN in the server system.
    Type: Application
    Filed: August 6, 2020
    Publication date: February 10, 2022
    Inventor: Poorna Kale
  • Publication number: 20220036157
    Abstract: Methods, systems, and apparatus related to dynamic distribution of an artificial neural network among multiple processing nodes based on real-time monitoring of a processing load on each node. In one approach, a server acts as an intelligent artificial intelligence (AI) gateway. The server receives data regarding a respective operating status for each of monitored processing devices. The monitored processing devices perform processing for an artificial neural network (ANN). The monitored processing devices each perform processing for a portion of the neurons in the ANN. The portions are distributed in response to monitoring the processing load on each processing device (e.g., to better utilize processing power across all of the processing devices).
    Type: Application
    Filed: July 29, 2020
    Publication date: February 3, 2022
    Inventors: Poorna Kale, Amit Gattani
  • Publication number: 20220036164
    Abstract: Systems, methods and apparatus of integrated image sensing devices. In one example, a system includes an image sensor that generates image data. A memory device is stacked with the image sensor and stores the generated image data. A host interface communicates with a host system. The memory device includes an inference engine to generate inference results using the stored image data as input to an artificial neural network. The inference engine includes a neural network accelerator configured to perform matrix arithmetic computations on the data stored in the memory device. The host interface sends the inference results to the host system for processing.
    Type: Application
    Filed: July 29, 2020
    Publication date: February 3, 2022
    Inventors: Poorna Kale, Amit Gattani
  • Publication number: 20220032932
    Abstract: Systems, methods and apparatus of integrated image sensing devices. In one example, a system includes a sensor that generates data. A memory device stores the generated data, and further stores a first portion of an artificial neural network (ANN). A host interface of the system is configured to communicate with a host system that stores a second portion of the ANN. The memory device can be stacked with the sensor. The memory device includes an inference engine configured to generate inference results using the stored data as input to the first portion of the ANN. The host interface is further configured to send the inference results to the host system for processing by the host system using the second portion of the ANN.
    Type: Application
    Filed: July 29, 2020
    Publication date: February 3, 2022
    Inventors: Poorna Kale, Amit Gattani