Patents by Inventor Poorna Kale

Poorna Kale has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230266880
    Abstract: Methods, systems, and devices for techniques to improve latency for gaming applications are described. The memory system may be configured to operate in a gaming mode that may enable faster load times. In some cases, the gaming mode may enable faster game downloads from an external server. In some cases, the gaming mode may enable faster transfer of files between volatile storage and non-volatile storage at the memory system. The gaming mode may enable faster read and write operations, and faster switching between one or more gaming applications. The memory system may additionally or alternatively be configured to operate in a non-gaming mode, which may improve reliability and retention for other, non-gaming applications. The memory system may switch between the two modes depending on the application being executed by the system.
    Type: Application
    Filed: February 22, 2022
    Publication date: August 24, 2023
    Inventors: Qi Dong, Poorna Kale
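The gaming/non-gaming mode switch described in the abstract above could be sketched as follows; the mode names and trim settings here are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical trims: a latency-optimized gaming trim and a
# retention-optimized non-gaming trim.
GAMING = {"slc_cache": True, "background_gc": False}
NON_GAMING = {"slc_cache": False, "background_gc": True}

class MemorySystem:
    def __init__(self):
        self.mode = "non-gaming"
        self.settings = dict(NON_GAMING)

    def set_mode(self, app_type):
        # Pick the trim based on the application the host is executing.
        self.mode = "gaming" if app_type == "game" else "non-gaming"
        self.settings = dict(GAMING if self.mode == "gaming" else NON_GAMING)
        return self.mode
```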
  • Patent number: 11734195
    Abstract: A first set of memory access operations is performed at a memory sub-system based on first operation settings that are configured based on a first operating environment of a host system. A detection is made that the host system is operating in a second operating environment that is different from the first operating environment. A level of impact that each operating requirement of a set of operating requirements of the memory sub-system has on a performance of the memory sub-system is determined in view of the second operating environment. A second set of memory access operation settings is determined based on a respective priority for each operating requirement of the set of operating requirements. A second set of memory access operations is performed at the memory sub-system based on the second set of memory access operation settings.
    Type: Grant
    Filed: December 5, 2022
    Date of Patent: August 22, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Christopher Bueb, Poorna Kale
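The ranking of operating requirements by their impact in the new environment, described in the abstract above, might be sketched like this; the requirement names, impact scores, and settings are hypothetical:

```python
def prioritize(requirements, impact):
    """Rank operating requirements by their impact score in the
    current operating environment (higher impact -> higher priority)."""
    return sorted(requirements, key=lambda r: impact[r], reverse=True)

def select_settings(ranked, settings_for):
    # Merge per-requirement settings in priority order; higher-impact
    # requirements win on conflicting keys.
    merged = {}
    for req in reversed(ranked):
        merged.update(settings_for[req])
    return merged
```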
  • Patent number: 11733763
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory that includes multiple memory groups having independent power modes. The random access memory is configured to store data representative of parameters of an Artificial Neural Network and representative of instructions executable by the Deep Learning Accelerator to perform matrix computation to generate an output of the Artificial Neural Network. During execution of the instructions, a power manager may adjust grouping of memory addresses mapped into the memory groups and adjust power modes of the memory groups to reduce power consumption and to avoid performance impact.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: August 22, 2023
    Assignee: Micron Technology, Inc.
    Inventor: Poorna Kale
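The power manager described in the abstract above adjusts power modes of memory groups based on usage. A minimal sketch, with hypothetical group ids and mode names:

```python
def adjust_power_modes(groups, recent_accesses):
    """Keep recently accessed memory groups active and put idle
    groups into a low-power mode to reduce consumption.

    groups: iterable of group ids
    recent_accesses: set of group ids touched in the last interval
    Returns a {group: mode} mapping.
    """
    return {g: ("active" if g in recent_accesses else "low_power")
            for g in groups}
```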
  • Patent number: 11729318
    Abstract: A memory device can be configured to direct communication of data from an image sensor to the memory device and/or image signal processing circuitry coupled thereto. The memory device can be configured to receive first signaling indicative of first data from an image sensor via a first port and provide the first signaling from the memory device to image signal processing (ISP) circuitry via a second port. The memory device can be configured to receive second signaling indicative of second data from the image sensor while the ISP circuitry operates on the first data. An image processing operation can be performed using logic circuitry of the memory device. Directing communication of data using the memory device can reduce data transfers, reduce resource consumption of an imaging system, and offload workloads from a host device and/or a host processing device, for example.
    Type: Grant
    Filed: November 18, 2020
    Date of Patent: August 15, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Richard C. Murphy, Amit Gattani
  • Patent number: 11726784
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. An edge server may be configured on a local area network to receive sensor data of a person, such as a patient in a hospital or care center. The edge server may be implemented using an integrated circuit device having: a Deep Learning Accelerator configured to execute instructions with matrix operands; and random access memory configured to store first instructions of an Artificial Neural Network executable by the Deep Learning Accelerator and second instructions of a server application executable by a Central Processing Unit. An output of the Artificial Neural Network with the sensor data as input may identify a condition of the person, based on which the server application generates an alert, causing a central server to request intervention for the detected or predicted condition of the person.
    Type: Grant
    Filed: April 9, 2020
    Date of Patent: August 15, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Jaime Cummins
  • Publication number: 20230253024
    Abstract: Methods, systems, and devices for techniques for memory system refresh are described. In some cases, a memory system may prioritize refreshing blocks of memory cells containing control information for the file system of the memory system. For example, the memory system may identify a block of memory cells containing control information and set its error threshold for refresh lower than the error threshold for blocks of memory cells containing data other than control information. Additionally or alternatively, the memory system may perform a refresh control operation for the block of memory cells containing control information more frequently than for other blocks of memory cells.
    Type: Application
    Filed: February 9, 2022
    Publication date: August 10, 2023
    Inventors: Qi Dong, Poorna Kale
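The per-block error thresholds described in the abstract above might look like the following sketch; the threshold values and block representation are hypothetical:

```python
# Hypothetical thresholds: control-info blocks are refreshed earlier
# than user-data blocks, which tolerate more accumulated bit errors.
CONTROL_THRESHOLD = 8
DATA_THRESHOLD = 32

def needs_refresh(block):
    """block: dict with a 'bit_errors' count and an 'is_control' flag
    marking blocks that hold file-system control information."""
    limit = CONTROL_THRESHOLD if block["is_control"] else DATA_THRESHOLD
    return block["bit_errors"] >= limit
```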
  • Patent number: 11720268
    Abstract: A system can include a memory device with an array of memory cells and a machine learning operation component. The machine learning operation component can perform a machine learning computation in association with the array of memory cells. The system can also include a processing device that is operatively coupled with the memory device to perform operations that include setting the memory device to a first mode based on a first mode setting signal received from a host system, where in the first mode, the processing device exposes the array of memory cells to the host system and routes input data from the host system to the array of memory cells. The operations can also include setting the memory device to a second mode, where in the second mode, the processing device exposes the machine learning operation component to the host system.
    Type: Grant
    Filed: August 2, 2022
    Date of Patent: August 8, 2023
    Assignee: Micron Technology, Inc.
    Inventor: Poorna Kale
  • Patent number: 11720417
    Abstract: Systems, devices, and methods related to a Deep Learning Accelerator and memory are described. For example, an integrated circuit device may be configured to execute instructions with matrix operands and configured with random access memory. At least one interface of the integrated circuit device is configured to receive input data from a data source, and to receive, from a server system over a computer network, parameters of a first Artificial Neural Network (ANN) and instructions executable by the Deep Learning Accelerator to perform matrix computation of the first ANN. The Deep Learning Accelerator may execute the instructions to generate an output of the first ANN responsive to the input data; and the at least one interface is configured to transmit the output to the server system over the computer network as an input to a second ANN in the server system.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: August 8, 2023
    Assignee: Micron Technology, Inc.
    Inventor: Poorna Kale
  • Patent number: 11711488
    Abstract: A memory system having multiple address tables to translate logical addresses to physical addresses at different granularity levels is disclosed. For example, a first address table is associated with a first block size for translating logical addresses for accessing system files and application files; and a second address table is associated with a second block size for translating logical addresses for storing and/or retrieving data from an image sensor of a surveillance camera. A user interface can be used to access a configuration option to specify the second block size; a user may indicate a typical size of an image or video file to be recorded by the surveillance camera, which is used to calculate the second block size and thus configure the second address table for a partition that records the image or video files.
    Type: Grant
    Filed: August 31, 2021
    Date of Patent: July 25, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Christopher Joseph Bueb, Te-Chang Lin, Qi Dong
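The two-granularity logical-to-physical translation described in the abstract above can be sketched as a pair of tables with different block sizes; the class, block sizes, and mapping below are hypothetical illustrations:

```python
class AddressTable:
    """Logical-to-physical address table at a fixed translation
    block size (granularity)."""
    def __init__(self, block_size):
        self.block_size = block_size
        self.map = {}  # logical block index -> physical base address

    def translate(self, logical_addr):
        index, offset = divmod(logical_addr, self.block_size)
        return self.map[index] + offset

# A system/application partition might use fine-grained 4 KiB blocks,
# while a surveillance-video partition uses coarse blocks sized from
# the typical recording size, shrinking its table.
sys_table = AddressTable(4096)
video_table = AddressTable(4 * 1024 * 1024)
```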
  • Patent number: 11709625
    Abstract: Systems, methods and apparatuses to control power usage of a data storage device. For example, the data storage device has a temperature sensor configured to measure the temperature of the data storage device. A controller of the data storage device determines a set of operating parameters that identify an operating condition of the data storage device. An inference engine of the data storage device determines, using an artificial neural network in the data storage device and based on the set of operating parameters, an operation schedule for a period of time of processing input and output of the data storage device. The operation schedule is configured to optimize a performance of the data storage device in the period of time without the temperature of the data storage device rising above a threshold.
    Type: Grant
    Filed: February 14, 2020
    Date of Patent: July 25, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Robert Richard Noel Bielby
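A simple stand-in for the temperature-bounded scheduling described in the abstract above: pick the highest I/O rate whose predicted temperature stays under the limit. The rate values and temperature model are hypothetical, and a lookup table stands in for the artificial neural network:

```python
def pick_io_rate(current_temp, limit, rates, temp_rise_per_rate):
    """Choose the highest I/O rate (performance) whose predicted
    temperature remains below the threshold; fall back to the
    lowest rate if none qualifies."""
    feasible = [r for r in sorted(rates)
                if current_temp + temp_rise_per_rate[r] < limit]
    return feasible[-1] if feasible else min(rates)
```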
  • Patent number: 11704599
    Abstract: A system including a machine learning processing device and a memory device with microbumps is disclosed. The machine learning processing device is for performing a machine learning operation and includes a first set of microbumps. The memory device is for storing data for the machine learning operation and includes a second set of microbumps. The first set of microbumps of the machine learning processing device is coupled with the second set of microbumps of the memory device, and the coupled sets of microbumps are to transmit the data for the machine learning operation.
    Type: Grant
    Filed: December 4, 2019
    Date of Patent: July 18, 2023
    Assignee: Micron Technology, Inc.
    Inventor: Poorna Kale
  • Patent number: 11702086
    Abstract: Systems, methods and apparatus of recordation of vehicle data associated with errant vehicle behavior. For example, a vehicle includes: sensors configured to generate sensor data; control elements configured to generate control signals to be applied to the vehicle in response to user interactions with the control elements; electronic control units configured to provide status data in operations of the electronic control units; and a data storage device. The data storage device is configured to receive input data including the sensor data, the control signals and the status data, store the input data in a cyclic way in an input partition over time, generate a classification of errant behavior based on the input data and using an artificial neural network, and preserve a portion of the input data associated with the classification of errant behavior.
    Type: Grant
    Filed: August 21, 2019
    Date of Patent: July 18, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Robert Richard Noel Bielby, Poorna Kale
  • Patent number: 11702001
    Abstract: Systems, methods and apparatuses to detect an item left in a vehicle and to generate an alert about the item. For example, a camera configured in a vehicle can be used to monitor an item associated with a user of the vehicle. The item, as captured in an image from the camera, can be identified and recognized using an artificial neural network. In response to a determination that the item recognized in the image is left in the vehicle after the user has exited the vehicle, an alert is generated to indicate that the item remains in the vehicle while the user is leaving it.
    Type: Grant
    Filed: July 14, 2021
    Date of Patent: July 18, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Robert Richard Noel Bielby
  • Patent number: 11693562
    Abstract: Systems, methods and apparatus of intelligent bandwidth allocation to different types of operations to access storage media in a data storage device. For example, a data storage device of a vehicle includes: storage media components; a controller configured to store data into and retrieve data from the storage media components according to commands received in the data storage device; and an artificial neural network configured to receive, as input and as a function of time, operating parameters indicative of a data access pattern, and generate, based on the input, a prediction to determine an optimized bandwidth allocation scheme for controlling access by different types of operations in the data storage device to the storage media components. The controller is configured to schedule the operations of the different types to access the storage media components according to the optimized bandwidth allocation scheme.
    Type: Grant
    Filed: September 5, 2019
    Date of Patent: July 4, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Robert Richard Noel Bielby
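One simple bandwidth allocation scheme consistent with the abstract above is proportional sharing based on predicted demand; the operation types and numbers below are hypothetical, and the predicted-load input stands in for the neural network's output:

```python
def allocate_bandwidth(predicted_load, total_mb_s):
    """Split total media bandwidth among operation types in
    proportion to the predicted demand for the upcoming window."""
    total = sum(predicted_load.values())
    return {op: total_mb_s * load / total
            for op, load in predicted_load.items()}
```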
  • Patent number: 11694076
    Abstract: A memory component can include memory cells where a first region of the memory cells is to store a machine learning model and a second region of the memory cells is to store input data and output data of a machine learning operation. A controller can be coupled to the memory component with one or more internal buses to perform the machine learning operation by applying the machine learning model to the input data to generate the output data.
    Type: Grant
    Filed: October 14, 2019
    Date of Patent: July 4, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Amit Gattani, Poorna Kale
  • Publication number: 20230208815
    Abstract: Methods, systems, and devices for security configurations for zonal computing architecture are described. A zonal computing system in a vehicle may be associated with multiple zones. The zonal computing system may include devices (e.g., sensors, actuators) that interact with the vehicle or an environment associated with the vehicle. A memory system included in the zonal computing system may authenticate whether a device associated with a zone is a trusted device and enable or restrict communications with the device based on the authentication. For example, the zonal computing system may include a central processor that communicates with a remote server and the multiple zones and may include a gateway processor coupled with the central processor and the device and associated with the zone. Based on whether the device is trusted, the memory system may enable or restrict communications between the central processor and the device and routed through the gateway processor.
    Type: Application
    Filed: December 29, 2021
    Publication date: June 29, 2023
    Inventors: Poorna Kale, Robert Noel Bielby
  • Patent number: 11681903
    Abstract: Systems, methods and apparatus of implementing spiking neural networks. For example, an integrated circuit includes a crossbar array of first memristors connected between wordlines and bitlines. The first memristors are configured to convert voltages applied on the wordlines into currents in the bitlines. Second memristors having thresholds are connected to the bitlines respectively. Each respective memristor in the second memristors can reduce its resistance to cause spiking in a current flowing through the respective memristor, when the current reaches the threshold of the respective memristor. Current level detectors are connected to the second memristors to determine whether the currents in the bitlines have levels corresponding to reaching thresholds of the second memristors, and thus generate output spikes of spiking neurons without using analog-to-digital converters to measure the currents in the bitlines.
    Type: Grant
    Filed: October 31, 2019
    Date of Patent: June 20, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Anakha Vasanthakumaribabu, Poorna Kale
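The crossbar computation described in the abstract above amounts to a dot product per bitline, with a spike emitted when the column current reaches the output memristor's threshold. A purely numerical sketch (hypothetical conductances, voltages, and thresholds; the real circuit operates in the analog domain):

```python
def bitline_spikes(columns, input_voltages, thresholds):
    """columns: per-bitline lists of memristor conductances;
    input_voltages: wordline voltages applied to the crossbar.
    Each bitline current is the dot product of conductances and
    voltages; a bitline 'spikes' when its current reaches the
    threshold of its second (output) memristor."""
    currents = [sum(g * v for g, v in zip(col, input_voltages))
                for col in columns]
    return [i >= t for i, t in zip(currents, thresholds)]
```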
  • Patent number: 11681909
    Abstract: Memory cells can include a memory region to store a machine learning model and input data and another memory region to store host data from a host system. An in-memory logic can be coupled to the plurality of memory cells and can perform a machine learning operation by applying the machine learning model to the input data to generate output data. A bus can receive additional host data from the host system and can provide the additional host data to the memory component for the other memory region of the plurality of memory cells. An additional bus can receive machine learning data from the host system and can provide the machine learning data to the memory component for the in-memory logic that is to perform the machine learning operation.
    Type: Grant
    Filed: October 14, 2019
    Date of Patent: June 20, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Amit Gattani
  • Patent number: 11675498
    Abstract: One or more usage parameter values associated with a host system are obtained. The one or more parameter values correspond to one or more operations associated with a memory sub-system. An expected time period during which a set of host data will be received from the host system is determined in view of the one or more usage parameter values. In response to a determination, in view of an indication received from the host system, that the set of host data will not be received at the expected time period, a media management operation is performed at memory units of the memory sub-system.
    Type: Grant
    Filed: March 8, 2022
    Date of Patent: June 13, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Poorna Kale, Ashok Sahoo
  • Patent number: 11676010
    Abstract: A system includes a memory component to store host data from a host system and to store a machine learning model and input data. A controller includes an in-memory logic to perform a machine learning operation by applying the machine learning model to the input data to generate output data. A bus can receive additional host data from the host system and provide the additional host data to the memory component. An additional bus can receive machine learning data from the host system and provide the machine learning data to the in-memory logic that is to perform the machine learning operation.
    Type: Grant
    Filed: October 14, 2019
    Date of Patent: June 13, 2023
    Assignee: Micron Technology, Inc.
    Inventors: Amit Gattani, Poorna Kale