Patents by Inventor Vamsi Krishna
Vamsi Krishna has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12293081
Abstract: The present disclosure relates to the field of Dual In-Line Memory Modules (DIMMs) and discloses a method and system for generating memory maps. The method comprises detecting, by a computing system, at least one of a DIMM and one or more Dynamic Random Access Memory (DRAM) chips associated with the computing system. One or more accelerators are configured in at least one of the DIMM and the one or more DRAM chips. The method further includes determining accelerator information for each of the one or more accelerators via at least one of a Serial Presence Detect (SPD) and a Multi-Purpose Register (MPR) associated with at least one of the DIMM and the one or more DRAM chips, and generating a unique memory map for each of the one or more accelerators based on the accelerator information of the corresponding accelerator. As a result, performance of the computing system may be improved because the accelerator capabilities of the one or more accelerators are effectively utilized.
Type: Grant
Filed: May 2, 2023
Date of Patent: May 6, 2025
Assignee: Samsung Electronics Co., Ltd.
Inventors: Raghu Vamsi Krishna Talanki, Archita Khare, Eldho P. Mathew, Jin In So, Jong-Geon Lee, Venkata Ravi Shankar Jonnalagadda, Vishnu Charan Thummala
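A minimal sketch of the memory-map generation step this abstract describes, assuming accelerator records of the kind that might be read via SPD/MPR; the field names, region sizes, and base address are illustrative, not the patented layout.

```python
# Illustrative sketch: gather per-accelerator information (as would be reported
# via SPD / MPR) and emit a unique, non-overlapping address window per accelerator.
ACCELERATOR_INFO = [                     # hypothetical records, not real SPD contents
    {"id": "dimm0/acc0", "region_bytes": 64 * 2**20},
    {"id": "dimm0/acc1", "region_bytes": 32 * 2**20},
    {"id": "dram2/acc0", "region_bytes": 16 * 2**20},
]

def generate_memory_maps(accelerators, base_address=0x1_0000_0000):
    memory_maps, cursor = {}, base_address
    for acc in accelerators:
        # Each accelerator gets its own contiguous, non-overlapping window.
        memory_maps[acc["id"]] = (cursor, cursor + acc["region_bytes"] - 1)
        cursor += acc["region_bytes"]
    return memory_maps

for acc_id, (start, end) in generate_memory_maps(ACCELERATOR_INFO).items():
    print(f"{acc_id}: 0x{start:X} - 0x{end:X}")
```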
-
Publication number: 20250124332
Abstract: A method for facilitating supervised generative optimization for synthetic data generation is disclosed. The method includes receiving, via an application programming interface, inputs that include input data and parameters; partitioning the input data to generate data sets, the data sets including training data sets, validation data sets, and test data sets; tuning hyperparameters of synthesizers by using the data sets and supervised optimization that is based on downstream performance metrics; determining a mixture distribution from among the tuned synthesizers; training machine learning models based on the mixture distribution; and generating, by using the trained machine learning models, sets of synthetic data based on the input data.
Type: Application
Filed: October 11, 2023
Publication date: April 17, 2025
Applicant: JPMorgan Chase Bank, N.A.
Inventors: Shinpei NAKAMURA SAKAI, Fadi HAMAD, Saheed OBITAYO, Vamsi Krishna POTLURU
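A toy end-to-end sketch of the pipeline the abstract outlines (partition, tune synthesizers against a downstream metric, form a mixture, sample synthetic data). The `GaussianSynthesizer`, the softmax mixture weighting, and the `downstream_score` metric are assumptions for illustration, not the claimed method.

```python
import numpy as np

rng = np.random.default_rng(0)

def partition(data, train=0.6, val=0.2):
    """Split input data into training, validation, and test sets."""
    n = len(data)
    idx = rng.permutation(n)
    a, b = int(train * n), int((train + val) * n)
    return data[idx[:a]], data[idx[a:b]], data[idx[b:]]

class GaussianSynthesizer:
    """Toy synthesizer: fits a Gaussian with a tunable noise-scale hyperparameter."""
    def __init__(self, noise_scale):
        self.noise_scale = noise_scale
    def fit(self, data):
        self.mean, self.std = data.mean(0), data.std(0)
        return self
    def sample(self, n):
        return rng.normal(self.mean, self.std * self.noise_scale, size=(n, len(self.mean)))

def downstream_score(synthetic, validation):
    # Stand-in for a real downstream performance metric.
    return -np.abs(synthetic.mean(0) - validation.mean(0)).sum()

data = rng.normal(size=(1000, 4))
train, val, test = partition(data)

# "Hyperparameter tuning" over candidate synthesizers, supervised by the metric.
candidates = [GaussianSynthesizer(s).fit(train) for s in (0.5, 1.0, 2.0)]
scores = np.array([downstream_score(c.sample(500), val) for c in candidates])

# Mixture distribution over the tuned synthesizers (softmax of their scores).
weights = np.exp(scores - scores.max())
weights /= weights.sum()
counts = rng.multinomial(1000, weights)
synthetic = np.vstack([c.sample(n) for c, n in zip(candidates, counts) if n > 0])
print(synthetic.shape, weights.round(2))
```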
-
Patent number: 12265729
Abstract: This disclosure provides systems, methods, and devices for memory systems that support enhanced write buffer flush schemes. In a first aspect, a method performed by a memory controller includes detecting, by the memory controller, a flush operation associated with a write buffer. The method also includes detecting, by the memory controller during the flush operation, a command for placement into a command queue. The method further includes prioritizing, by the memory controller, the flush operation by placing the command in a wait queue and maintaining the flush operation. Other aspects and features are also claimed and described.
Type: Grant
Filed: January 16, 2024
Date of Patent: April 1, 2025
Assignee: QUALCOMM Incorporated
Inventors: Vamsi Krishna Sambangi, Sai Naresh Gajapaka, Venkatesha M Iyengar, Madhu Yashwanth Boenapalli, Sai Praneeth Sreeram
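A small behavioral model of the queueing idea described above: while a flush is in progress, newly detected commands are parked in a wait queue instead of the command queue, and are drained only after the flush completes. The class and method names are hypothetical.

```python
from collections import deque

class MemoryController:
    def __init__(self):
        self.command_queue = deque()
        self.wait_queue = deque()
        self.flush_in_progress = False

    def start_flush(self):
        self.flush_in_progress = True

    def submit(self, command):
        # Prioritize the flush: park the command rather than queueing it.
        if self.flush_in_progress:
            self.wait_queue.append(command)
        else:
            self.command_queue.append(command)

    def finish_flush(self):
        self.flush_in_progress = False
        # Drain the wait queue into the command queue in arrival order.
        while self.wait_queue:
            self.command_queue.append(self.wait_queue.popleft())

ctrl = MemoryController()
ctrl.start_flush()
ctrl.submit("WRITE blk=42")      # arrives during the flush -> wait queue
ctrl.finish_flush()
print(list(ctrl.command_queue))  # ['WRITE blk=42']
```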
-
Publication number: 20250094584
Abstract: Systems and methods for improving security of a Universal Flash Storage (UFS) device during a purge operation. If a purge operation being performed in the UFS device is interrupted prematurely, due to issuance of one or more urgent commands, the UFS device notifies the host processor that the purge operation has been interrupted. After the host processor performs the urgent command(s), it causes the UFS device to resume performance of the purge operation and the host processor delays performance of any other commands that arrive at the command queue (CQ) of the host processor until the resumed purge operation has been completed. Because the purge operation is resumed and completed before the host processor performs any of the other command(s) that have arrived at the CQ, die-level attacks seeking to access unpurged data by sending read commands to the CQ of the host processor are thwarted.
Type: Application
Filed: September 15, 2023
Publication date: March 20, 2025
Inventors: Vamsi Krishna SAMBANGI, Sai Naresh GAJAPAKA, Santhosh Reddy AKAVARAM, Venkatesha M IYENGAR
-
Publication number: 20250091037
Abstract: The present invention relates to a highly active catalyst for hydrogenation of unsaturated hydrocarbons, particularly aromatic compounds, in middle distillate refinery streams, wherein the catalyst comprises oxides of transition metals, preferably nickel oxide and cobalt oxide, impregnated on an activated alumina-ceria carrier with a loading in a range of 20 to 30 wt. % of the total dry weight of the catalyst. The active metals nickel and cobalt of the catalyst form a bimetallic active site dispersed on the activated alumina-ceria carrier surface such that at least 50% of the loaded metal undergoes reduction with hydrogen gas in a temperature range of 150 to 850° C. The resultant catalyst has exhibited high activity for hydrogenating unsaturated hydrocarbons such as aromatics in kerosene and diesel streams.
Type: Application
Filed: September 13, 2024
Publication date: March 20, 2025
Inventors: Vamsi Krishna NUNNA, Kochappilly Ouseph XAVIER, Alex Cheru PULIKOTTIL, Madhusudan SAU, Sankara Sri Venkata RAMAKUMAR
-
Patent number: 12255820
Abstract: An apparatus includes a memory and a processor. The memory stores information identifying a first set of protocols associated with communication channels available to a first system and a second set of protocols associated with communication channels available to a second system. The processor receives, from the first system, a request to transmit data, and selects, based at least on a size of the data and a load of each communication channel available to the first system, a protocol of the first set of protocols. The processor instructs the first system to transmit the data according to the selected protocol. The processor receives, over the communication channel associated with the selected protocol, the data from the first system. The processor further selects a protocol of the second set of protocols, and transmits, over the communication channel associated with the selected protocol, the data to the second system.
Type: Grant
Filed: April 11, 2022
Date of Patent: March 18, 2025
Assignee: Bank of America Corporation
Inventor: Naga Vamsi Krishna Akkapeddi
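A hedged sketch of the selection step described above: pick a protocol from a system's available set based on the payload size and the load of each protocol's channel. The scoring rule (least-loaded channel that can carry the payload) and the size limits are assumptions, not the patented logic.

```python
from dataclasses import dataclass

@dataclass
class Channel:
    protocol: str
    max_payload_bytes: int
    load: float  # 0.0 (idle) .. 1.0 (saturated)

def select_protocol(channels, data_size):
    eligible = [c for c in channels if c.max_payload_bytes >= data_size]
    if not eligible:
        raise ValueError("no channel can carry a payload of this size")
    # Prefer the least-loaded channel that can carry the payload.
    return min(eligible, key=lambda c: c.load).protocol

first_system_channels = [
    Channel("sftp",  max_payload_bytes=10**9, load=0.7),
    Channel("https", max_payload_bytes=10**7, load=0.2),
    Channel("mq",    max_payload_bytes=10**6, load=0.1),
]
print(select_protocol(first_system_channels, data_size=5 * 10**6))  # 'https'
```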
-
Publication number: 20250080499
Abstract: Systems, computer program products, and methods are described herein for a cloud-based virtual private secured contained communication portal. The present disclosure is configured to receive a request from a user device to connect to one or more entity representatives; receive and authenticate authentication credentials received from the user device; generate a virtual private network (VPN) configuration for the user device; generate a secure application programming interface (API) call from the user device to one or more entity cloud services based on information contained in the request to form an operable connection between the user device and the one or more entity representatives; and block all incoming or outgoing network connections on the user device via the VPN configuration other than those necessary to connect to the one or more entity cloud services while the user device is operably connected to the one or more entity representatives.
Type: Application
Filed: November 19, 2024
Publication date: March 6, 2025
Applicant: BANK OF AMERICA CORPORATION
Inventor: Naga Vamsi Krishna Akkapeddi
-
Publication number: 20250080365
Abstract: A first device transmits a request message to a proxy device to forward to a second device. The request message includes a public key. The second device transmits a response message to the proxy device to forward to the first device. The response message includes a cryptographic nonce and is encrypted with the public key. The first device decrypts the response message and generates a session key based on the nonce and a pre-shared password, then transmits a challenge response encrypted with the session key to the proxy device to forward to the second device. The second device generates the same session key and decrypts the challenge response with it. Once the second device confirms the challenge response and a secure session is thereby established, the first and second devices communicate with one another over the secure session.
Type: Application
Filed: August 29, 2023
Publication date: March 6, 2025
Applicant: MICRO FOCUS LLC
Inventors: Vamsi Krishna, Daniel L. Christensen
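A purely illustrative walk-through of the message flow above, using only the Python standard library. The public-key encryption of the nonce is elided (marked below), and key derivation and the challenge check are simulated with PBKDF2 and HMAC; this is a sketch of the protocol steps, not a secure or claimed implementation.

```python
import hashlib, hmac, secrets

PRE_SHARED_PASSWORD = b"correct horse battery staple"   # known to both devices

def derive_session_key(nonce: bytes, password: bytes) -> bytes:
    # Both endpoints derive the same session key from the nonce + pre-shared password.
    return hashlib.pbkdf2_hmac("sha256", password, nonce, 100_000, dklen=32)

# First device -> proxy -> second device: request message carrying a public key (elided).
# Second device -> proxy -> first device: response message carrying a nonce,
# encrypted with that public key (encryption elided in this sketch).
nonce = secrets.token_bytes(16)

# First device: decrypts the response, derives the session key, answers a challenge.
k_first = derive_session_key(nonce, PRE_SHARED_PASSWORD)
challenge_response = hmac.new(k_first, b"challenge", hashlib.sha256).digest()

# Second device: derives the same key and verifies the challenge response.
k_second = derive_session_key(nonce, PRE_SHARED_PASSWORD)
expected = hmac.new(k_second, b"challenge", hashlib.sha256).digest()
print("secure session established:", hmac.compare_digest(challenge_response, expected))
```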
-
Patent number: 12240341
Abstract: A dense charging station for charging the battery may have lanes arranged in parallel, and each of the lanes may have sequential charging locations. A vehicle utilizing the charging station may position itself at the first available charging location, begin receiving energy, determine whether a subsequent charging location becomes available in the lane, and then position itself at that subsequent charging location once it is available. Multiple charging stations may be required to maintain a threshold power state for individual vehicles in a fleet of vehicles providing a service for a geographic region. A charging coordinator may determine when a battery of a vehicle does not satisfy a threshold power state and requires a recharge. Additionally, the charging coordinator may determine a candidate charging station from among multiple charging stations associated with the geographic region.
Type: Grant
Filed: June 30, 2021
Date of Patent: March 4, 2025
Assignee: Zoox, Inc.
Inventor: Vamsi Krishna Pathipati
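A simplified sketch of the coordination decision described above: flag vehicles whose battery falls below a threshold power state and pick a candidate charging station for them. Distance-based selection and the field names are assumptions for illustration.

```python
from dataclasses import dataclass
import math

POWER_THRESHOLD = 0.25  # fraction of full charge below which a recharge is needed

@dataclass
class Vehicle:
    vid: str
    battery: float          # 0.0 .. 1.0
    location: tuple

@dataclass
class ChargingStation:
    sid: str
    location: tuple
    free_locations: int     # open sequential charging locations in its lanes

def needs_recharge(vehicle):
    return vehicle.battery < POWER_THRESHOLD

def candidate_station(vehicle, stations):
    available = [s for s in stations if s.free_locations > 0]
    # Choose the closest station with at least one open charging location.
    return min(available, key=lambda s: math.dist(vehicle.location, s.location))

fleet = [Vehicle("v1", 0.18, (0.0, 0.0)), Vehicle("v2", 0.80, (5.0, 5.0))]
stations = [ChargingStation("s1", (1.0, 1.0), 0), ChargingStation("s2", (2.0, 2.0), 3)]

for v in fleet:
    if needs_recharge(v):
        print(v.vid, "->", candidate_station(v, stations).sid)   # v1 -> s2
```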
-
Patent number: 12236245
Abstract: Methods, systems, and apparatuses for graph streaming processing are disclosed. One method includes receiving, by a thread scheduler, a group of threads; calculating a resource requirement for execution of the group of threads; calculating resource availability in a plurality of processors of each of a plurality of processor arrays; dispatching the group of threads to a selected one of the plurality of processors of the processor arrays; and scheduling a group load instruction for all threads of the group of threads, including loading into a group load register a subset of inputs of the input tensor for processing of each thread of the group of threads, wherein the group load register provides the subset of the inputs of the input tensor to the group of threads of the selected one of the plurality of processors, and wherein all threads of the group of threads are synchronized when executing the group load instruction.
Type: Grant
Filed: June 12, 2023
Date of Patent: February 25, 2025
Assignee: Blaize Inc.
Inventors: Kota Vamsi Krishna Darsi, Sarvendra Govindammagari, Venkata Divyabharathi Palaparthy, Venkata Ganapathi Puppala, Satyaki Koneru
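A rough model of the dispatch decision described above: compute the resource requirement of a thread group, compare it with the free resources of each processor in the processor arrays, and dispatch the group to a processor that can hold it. The resource units (register counts) and names are illustrative, not the hardware's actual accounting.

```python
from dataclasses import dataclass

@dataclass
class Processor:
    pid: str
    total_registers: int
    used_registers: int = 0

    def free(self):
        return self.total_registers - self.used_registers

@dataclass
class ThreadGroup:
    threads: int
    registers_per_thread: int

    def requirement(self):
        return self.threads * self.registers_per_thread

def dispatch(group, processor_arrays):
    need = group.requirement()
    for array in processor_arrays:
        for proc in array:
            if proc.free() >= need:
                proc.used_registers += need     # reserve resources for the group
                return proc.pid
    return None                                 # no processor can take the group

arrays = [[Processor("a0", 256), Processor("a1", 256)],
          [Processor("b0", 512)]]
group = ThreadGroup(threads=32, registers_per_thread=10)   # needs 320 registers
print(dispatch(group, arrays))                              # 'b0'
```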
-
Publication number: 20250061125
Abstract: A system for extracting and providing information from a source document includes a memory and a processor. The memory stores data associated with the source document. The processor is configured to extract data from the source document and store the extracted data as first extracted data in the memory. When a request for information is received, the processor determines whether the first extracted data includes all of the requested information; if it does not, the processor extracts second extracted data from the image file and stores the second extracted data along with the first extracted data and the image file in the memory. The requested information is then provided to the requesting process from the first and second extracted data.
Type: Application
Filed: May 15, 2024
Publication date: February 20, 2025
Inventors: Naga Vamsi Krishna Akkapeddi, Awan Nord
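A minimal sketch of the two-pass flow described above. The extraction functions are stubs standing in for real OCR or parsing; the point is the control flow: serve a request from previously extracted data when possible, otherwise extract again and merge before responding.

```python
store = {}   # keyed by document id: {"image": ..., "extracted": {...}}

def first_pass_extract(image):
    return {"account_name": "ACME Corp"}                     # stub

def second_pass_extract(image, missing_fields):
    return {f: f"<value of {f}>" for f in missing_fields}    # stub

def ingest(doc_id, image):
    store[doc_id] = {"image": image, "extracted": first_pass_extract(image)}

def request_info(doc_id, fields):
    record = store[doc_id]
    missing = [f for f in fields if f not in record["extracted"]]
    if missing:
        # First extraction did not cover the request: extract again and merge.
        record["extracted"].update(second_pass_extract(record["image"], missing))
    return {f: record["extracted"][f] for f in fields}

ingest("doc-1", image=b"...raw scan bytes...")
print(request_info("doc-1", ["account_name", "invoice_total"]))
```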
-
Patent number: 12231406
Abstract: Systems, computer program products, and methods are described herein for a cloud-based virtual private secured contained communication portal. The present disclosure is configured to receive a request from a user device to connect to one or more entity representatives; analyze the request to determine a device identifier and customer identification number; access an entity database and retrieve resource transfer history data and resource account data for the customer identification number; generate, near-real-time, a virtual private network (VPN) configuration for the user device; and generate a secure application programming interface (API) call from the user device to one or more entity cloud services based on information contained in the request to form an operable connection between the user device and the one or more entity representatives.
Type: Grant
Filed: October 21, 2022
Date of Patent: February 18, 2025
Assignee: BANK OF AMERICA CORPORATION
Inventor: Naga Vamsi Krishna Akkapeddi
-
Publication number: 20250056828
Abstract: Some implementations herein provide for a memory device and methods of formation. The memory device includes a plurality of storage cells arranged vertically and a plurality of corresponding gate all around transistors. Methods of forming the memory device include using a single trench to remove a liner material and form recesses that define cell contact lightly-doped drain regions of the gate all around transistors. Using the single trench to remove the liner material and form the recesses that define the cell contact lightly-doped drain region widths causes the cell contact lightly-doped drain regions to be formed having substantially similar widths.
Type: Application
Filed: July 24, 2024
Publication date: February 13, 2025
Inventors: Si-Woo LEE, Yuichi YOKOYAMA, Scott E. SILLS, Gautham MUTHUSAMY, David HWANG, Yoshitaka NAKAMURA, Pavani Vamsi Krishna NITTALA, Yuanzhi MA, Glen H. WALTERS, Haitao LIU, Kamal M. KARDA
-
Publication number: 20250056431
Abstract: This disclosure provides systems, methods, and apparatuses, including computer programs encoded on computer storage media, for wireless communication. Various aspects relate to initial physical random access channel (PRACH) power control using multiple PRACH signals, and more particularly to supporting one or more initialization and power control operations between a user equipment (UE) and a network entity prior to an association process. The initial PRACH power control may include the UE transmitting, to the network entity, a first set of PRACH signals at different respective power levels. The network entity may transmit a response message indicating a single set of PRACH signals detected by the network entity, and the UE may detect whether a first PRACH signal, of the first set of PRACH signals, is included in the single set of PRACH signals to select a power level associated with the first PRACH signal for transmitting association messages.
Type: Application
Filed: August 7, 2023
Publication date: February 13, 2025
Inventors: Jing Sun, Xiaoxia Zhang, Jing Jiang, Junyi Li, Raviteja Patchava, Vamsi Krishna Amalladinne
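A toy sketch of the exchange described above: the UE transmits PRACH signals at several power levels, the network reports which it detected, and the UE selects a power level associated with a detected signal for its association messages. The detection model and the "pick the lowest detected power" rule are assumptions for illustration only.

```python
POWER_LEVELS_DBM = [-10, -4, 2, 8]        # per-PRACH-signal transmit powers (hypothetical)
DETECTION_THRESHOLD_DBM = 0               # toy stand-in for the network's detection behavior

def network_detects(power_levels):
    """Response-message content: the set of PRACH signals the network entity detected."""
    return {p for p in power_levels if p >= DETECTION_THRESHOLD_DBM}

def select_association_power(power_levels, detected):
    detected_own = [p for p in power_levels if p in detected]
    if not detected_own:
        return None                       # nothing detected: UE would retry at higher power
    return min(detected_own)              # lowest power level the network could still hear

detected = network_detects(POWER_LEVELS_DBM)
print(select_association_power(POWER_LEVELS_DBM, detected))   # 2
```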
-
Patent number: 12223479
Abstract: Aspects of the disclosure relate to detecting usage issues on enterprise systems and dynamically providing user assistance. In some embodiments, a computing platform may receive, from an automated teller system, user presence information indicating that a user has been detected at an automated teller machine associated with the automated teller system. Subsequently, the computing platform may identify, based on the user presence information received from the automated teller system, that the user has a need for assistance with the automated teller machine associated with the automated teller system. In response to identifying that the user has the need for assistance, the computing platform may generate and send one or more commands directing the automated teller machine associated with the automated teller system to execute one or more automated assistance actions corresponding to the need for assistance.
Type: Grant
Filed: June 20, 2023
Date of Patent: February 11, 2025
Assignee: Bank of America Corporation
Inventors: Naga Vamsi Krishna Akkapeddi, Morgan S. Allen, Susan Moss, Stephen T. Shannon, Siten Sanghvi, Pratap Dande
-
Patent number: 12223188
Abstract: A memory interface for interfacing with a memory device includes a control circuit configured to determine whether a trigger event has occurred for initializing one or more memory locations in the memory device, and initialize the one or more memory locations in the memory device with pre-defined data upon determining the trigger event has occurred.
Type: Grant
Filed: May 19, 2022
Date of Patent: February 11, 2025
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Raghu Vamsi Krishna Talanki, Archita Khare, Rahul Tarikere Ravikumar, Jinin So, Jonggeon Lee
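A small sketch of the behavior described above: a control path that, when a configured trigger event occurs, initializes the given memory locations with pre-defined data. The trigger names, the pattern, and the in-memory "device" are illustrative.

```python
PREDEFINED_PATTERN = 0x00
TRIGGER_EVENTS = {"power_on", "deallocation"}            # hypothetical trigger events

memory_device = {addr: 0xFF for addr in range(0x100, 0x108)}   # toy memory contents

def on_event(event, locations):
    if event in TRIGGER_EVENTS:
        for addr in locations:
            memory_device[addr] = PREDEFINED_PATTERN     # initialize with pre-defined data

on_event("deallocation", locations=range(0x100, 0x104))
print({hex(a): hex(v) for a, v in memory_device.items()})
```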
-
Patent number: 12223088
Abstract: A method for preserving privacy with respect to modeling event sequence data is provided. The method includes: receiving information about a sequence of events; modeling the event sequence by a Hawkes process that has an intensity that includes an exogenous base intensity rate and an indigenous component that has an excitation rate and a decay rate; analyzing the received information; and determining estimated values of the exogenous base intensity rate and the excitation rate, such that an accuracy of the estimates corresponds to a length of time over which the sequence of events is observed. Differential privacy is introduced by adding noise to the sequence of events in order to preserve the privacy of individuals associated with the events, and a cost of the differential privacy is expressible as an additional length of observation time required to ensure the accuracy of the estimates.
Type: Grant
Filed: October 11, 2022
Date of Patent: February 11, 2025
Assignee: JPMORGAN CHASE BANK, N.A.
Inventors: Mohsen Ghassemi, Eleonora Kreacic, Niccolo Dalmasso, Vamsi Krishna Potluru, Tucker Richard Balch, Manuela Veloso
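For reference, a Hawkes process of the kind described here, with an exogenous base intensity rate, an excitation rate, and a decay rate, is commonly written in the following standard form (the notation is an assumption, not the patent's exact formulation), where $\mu$ is the base rate, $\alpha$ the excitation rate, $\beta$ the decay rate, and $t_i$ the past event times:

```latex
\lambda(t) \;=\; \mu \;+\; \alpha \sum_{t_i < t} e^{-\beta\,(t - t_i)}
```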
-
Publication number: 20250044982
Abstract: Aspects relate to interrupting memory access during background operations of a memory device. In one example a memory device includes a memory controller to initiate a write buffer flush operation. A bus interface is coupled to a main memory and to a write buffer to receive a write command from a host during the write buffer flush operation. The memory controller initiates the write buffer flush operation, suspends the write buffer flush operation in response to the write command, sends a last flushed address of the write buffer from the memory device to the host through the bus interface, and unmaps a portion of the write buffer using the last flushed address.
Type: Application
Filed: August 2, 2023
Publication date: February 6, 2025
Inventors: Sai Naresh GAJAPAKA, Chintalapati BHARATH SAI VARMA, Santhosh Reddy AKAVARAM, Prakhar SRIVASTAVA, Vamsi Krishna SAMBANGI
-
Publication number: 20250040121
Abstract: Methods, systems, and devices for multi-layer capacitors for three-dimensional memory systems are described. Memory cells of a memory system may include capacitors having dielectric material between multiple interfaces (e.g., concentric interfaces) of a bottom electrode and a top electrode. A bottom electrode may include a first portion wrapping around a portion of a semiconductor material that is contiguous with a channel of a transistor, and a top electrode may include a first portion wrapping around the first portion of the bottom electrode. The bottom electrode may also include a second portion wrapping around the first portion of the top electrode, and the top electrode may also include a second portion wrapping around the second portion of the bottom electrode. The dielectric material may include respective portions between each interface of the bottom electrode and top electrode which, in some examples, may be a contiguous implementation of the dielectric material.
Type: Application
Filed: July 18, 2024
Publication date: January 30, 2025
Inventors: Yuanzhi Ma, Scott E. Sills, Si-Woo Lee, David K. Hwang, Yoshitaka Nakamura, Yuichi Yokoyama, Pavani Vamsi Krishna Nittala, Glen H. Walters, Gautham Muthusamy, Haitao Liu, Kamal Karda
-
Publication number: 20250028443
Abstract: Aspects of the disclosure relate to an NFT segmentation platform. The NFT segmentation platform may train an auto-segmentation model to generate tier scores corresponding to non-fungible tokens (NFTs). The NFT segmentation platform may compare the tier scores to tier thresholds defining threshold ranges. The NFT segmentation platform may automatically store the NFTs based on storage rules corresponding to the threshold ranges. The NFT segmentation platform may modify the storage location of the NFT based on changes in the tier score. The NFT segmentation platform may train an NFT validation model to generate NFT validation ratings for NFTs. The NFT segmentation platform may compare the NFT validation ratings to threshold values to determine whether or not to execute an event processing request. The NFT segmentation platform may create an iterative feedback loop to update the auto-segmentation model and the NFT validation model.
Type: Application
Filed: October 9, 2024
Publication date: January 23, 2025
Inventors: Naga Vamsi Krishna Akkapeddi, Siten Sanghvi
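A hedged sketch of the tier-scoring and storage-routing step described above. The score function stands in for the trained auto-segmentation model, and the threshold ranges and storage locations are illustrative assumptions, not the platform's actual rules.

```python
TIER_THRESHOLDS = [          # (lower bound of tier-score range, storage rule)
    (0.8, "hot_vault"),
    (0.5, "warm_storage"),
    (0.0, "cold_archive"),
]

def tier_score(nft):
    # Stand-in for the auto-segmentation model's output in [0, 1].
    return nft["score"]

def storage_location(nft):
    score = tier_score(nft)
    for lower_bound, location in TIER_THRESHOLDS:
        if score >= lower_bound:
            return location
    return "cold_archive"

nfts = [{"id": "nft-1", "score": 0.91}, {"id": "nft-2", "score": 0.42}]
for nft in nfts:
    print(nft["id"], "->", storage_location(nft))   # nft-1 -> hot_vault, nft-2 -> cold_archive
```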