Patents by Inventor Sanjay

Sanjay has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12176803
    Abstract: Increases in current drawn from power supply nodes in a computer system can result in unwanted drops in the voltages of the power supply nodes until power supply circuits can compensate for the increased load. To lessen the effects of increases in load currents, a decoupling circuit that includes a diode may be coupled to the power supply node. During a charge mode, a control circuit applies a current to the diode to store charge in the diode. During a boost mode, the control circuit can couple the diode to the power supply node. When the voltage level of the power supply node begins to drop, the diode can source a current to the power supply node using the previously stored charge. The diode may be directly coupled to the power supply node or be part of a switch-based system that employs multiple diodes to increase the discharge voltage.
    Type: Grant
    Filed: September 21, 2023
    Date of Patent: December 24, 2024
    Assignee: Apple Inc.
    Inventors: Chi Nung Ni, Sanjay Dabral
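A minimal behavioral sketch of the charge/boost scheme in the abstract above, not the patented circuit: the diode's stored charge is modeled as a simple capacitance, and all component values, thresholds, and currents are illustrative assumptions.

```python
# Behavioral sketch of the two control modes described above. The storage
# element is modeled as a capacitance; every numeric value is illustrative.
CHARGE, BOOST = "charge", "boost"

class DecouplingController:
    def __init__(self, c_store=1e-6, v_nominal=1.0, droop_fraction=0.95):
        self.c_store = c_store                       # effective storage capacitance (F)
        self.v_nominal = v_nominal                   # nominal supply voltage (V)
        self.droop_threshold = droop_fraction * v_nominal
        self.q_stored = 0.0                          # charge held in the diode (C)
        self.mode = CHARGE

    def step(self, v_supply, i_charge=1e-3, dt=1e-6):
        """Advance one time step given the observed supply-node voltage."""
        if v_supply < self.droop_threshold and self.q_stored > 0:
            # Boost mode: couple the storage element to the supply node and
            # source current from the previously stored charge.
            self.mode = BOOST
            i_boost = min(self.q_stored / dt, 10e-3)  # crude current limit
            self.q_stored -= i_boost * dt
            return i_boost
        # Charge mode: trickle current into the storage element.
        self.mode = CHARGE
        self.q_stored = min(self.q_stored + i_charge * dt,
                            self.c_store * self.v_nominal)
        return 0.0

ctrl = DecouplingController()
for v in [1.0, 1.0, 0.93, 0.92, 1.0]:                # simulated supply samples
    sourced = ctrl.step(v)
    print(f"v={v:.2f} V  mode={ctrl.mode:6s}  sourced={sourced*1e3:.3f} mA")
```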
  • Patent number: 12174989
    Abstract: Systems, computer program products, and methods are described herein for transformative data analysis and data modeling in a distributed network. The present invention is configured to receive sample data from a source system into a data ingestion engine, transmit, from the data ingestion engine to a data governance engine, a request for metadata of the sample data, construct metadata of the sample data, determine a data transformation of the sample data based on the schema, transform the sample data via the data transformation, and store transformed sample data and corresponding metadata in a shared data store. First and second views may also be generated, where the first view is a raw data view, and the second view is a processed view of the transformed data.
    Type: Grant
    Filed: June 14, 2023
    Date of Patent: December 24, 2024
    Assignee: BANK OF AMERICA CORPORATION
    Inventors: Ganesan Vijayan, Himanshu Goyal, Sanjay Negi
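A minimal sketch of the ingest, metadata, transform, and store flow described in the abstract above. The dictionary standing in for the shared data store, the type-inference "governance" step, and the cast-to-float transformation are illustrative assumptions, not the patented implementation.

```python
# Sketch: ingest sample data -> construct metadata -> transform -> shared store.
from typing import Any

def construct_metadata(sample: list[dict[str, Any]]) -> dict[str, str]:
    """Governance step: infer a simple column -> type schema from sample rows."""
    schema = {}
    for row in sample:
        for col, val in row.items():
            schema.setdefault(col, type(val).__name__)
    return schema

def transform(sample, schema):
    """Apply a schema-driven transformation (here: cast numeric columns to float)."""
    numeric = {c for c, t in schema.items() if t in ("int", "float")}
    return [{c: (float(v) if c in numeric else v) for c, v in row.items()}
            for row in sample]

shared_data_store = {}                          # stand-in for the shared data store

sample = [{"id": 1, "amount": 250, "region": "west"},
          {"id": 2, "amount": 310, "region": "east"}]
schema = construct_metadata(sample)             # metadata from the governance engine
processed = transform(sample, schema)           # transformation based on the schema

# Store transformed data plus metadata; expose raw and processed views.
shared_data_store["dataset_1"] = {"metadata": schema,
                                  "raw_view": sample,
                                  "processed_view": processed}
print(shared_data_store["dataset_1"]["processed_view"])
```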
  • Patent number: 12172058
    Abstract: An iron-type golf club head having a strike plate attached to a body and creating an internal cavity. The body includes an internal recess in a back portion and at least partially containing an internal tungsten weight that does not contact a back surface of the strike plate. The strike plate forms a part of a sole portion of the golf club head. An elastomer material is contained in the internal cavity and contacts the back portion and the back surface of the strike plate, and has a maximum elastomer material depth that is greater than a maximum thickness of the strike plate. The elastomer material extends to an elastomer elevation that is greater than a maximum elevation of the internal weight.
    Type: Grant
    Filed: February 8, 2024
    Date of Patent: December 24, 2024
    Assignee: TAYLOR MADE GOLF COMPANY, INC.
    Inventors: Paul M. Demkowski, Bret H. Wahl, Scott Taylor, Sanjay Kuttappa
  • Patent number: 12175644
    Abstract: The systems and methods described can include approaches to calibrate head-mounted displays for improved viewing experiences. Some methods include receiving data of a first target image associated with an undeformed state of a first eyepiece of a head-mounted display device; receiving data of a first captured image associated with a deformed state of the first eyepiece of the head-mounted display device; determining a first transformation that maps the first captured image to the first target image; and applying the first transformation to a subsequent image for viewing on the first eyepiece of the head-mounted display device.
    Type: Grant
    Filed: August 24, 2023
    Date of Patent: December 24, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Lionel Ernest Edwin, Samuel A. Miller, Etienne Gregoire Grossmann, Brian Christopher Clark, Michael Robert Johnson, Wenyi Zhao, Nukul Sanjay Shah, Po-Kang Huang
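A minimal sketch of the calibration idea in the abstract above: estimate a transformation that maps the captured (deformed) image to the target (undeformed) image, then apply it to later frames. The affine model, the use of point correspondences, and the toy deformation are assumptions; the patent does not commit to a particular transform.

```python
# Sketch: fit a 2x3 affine transform from captured -> target fiducial points,
# then apply it to correct a subsequent frame.
import numpy as np

def estimate_affine(captured_pts: np.ndarray, target_pts: np.ndarray) -> np.ndarray:
    """Least-squares fit of an affine transform so that target ~= A @ [x, y, 1]."""
    ones = np.ones((captured_pts.shape[0], 1))
    X = np.hstack([captured_pts, ones])                  # N x 3
    A, *_ = np.linalg.lstsq(X, target_pts, rcond=None)   # 3 x 2 solution
    return A.T                                           # 2 x 3

def apply_affine(A: np.ndarray, pts: np.ndarray) -> np.ndarray:
    ones = np.ones((pts.shape[0], 1))
    return np.hstack([pts, ones]) @ A.T

# Fiducial locations in the target image vs. where they appear when the
# eyepiece is deformed (toy values: slight scale and shift).
target = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], dtype=float)
captured = target * 1.02 + np.array([1.5, -0.8])

A = estimate_affine(captured, target)
corrected = apply_affine(A, captured)                    # applied to a later frame
print(np.round(corrected, 2))                            # recovers the target points
```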
  • Patent number: 12175211
    Abstract: Embodiments of the present invention provide a system for creating configurational blocks used for building continuous real-time software logical sequences. The system is configured for creating a set of configurational blocks associated with building one or more real-time software logical sequences, displaying the set of configurational blocks, via a graphical user interface, to a user, allowing the user to select one or more configurational blocks from the set of configurational blocks, receiving the one or more configurational blocks and one or more links associated with connection of the one or more configurational blocks from the user, via the graphical user interface, and generating a continuous real-time software logical sequence based on the one or more configurational blocks and the one or more links received from the user.
    Type: Grant
    Filed: December 13, 2021
    Date of Patent: December 24, 2024
    Assignee: BANK OF AMERICA CORPORATION
    Inventors: Anton Sumin, Piedad Burnside, Sanjay Pillay
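A minimal sketch of blocks connected by links to form a logical sequence, as described in the abstract above. The block names, the shared-context execution model, and the data structures are illustrative assumptions, not the patented implementation.

```python
# Sketch: configurational blocks + links -> a runnable logical sequence.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Block:
    name: str
    action: Callable[[dict], dict]            # transforms a shared context

@dataclass
class Sequence:
    blocks: dict[str, Block] = field(default_factory=dict)
    links: list[tuple[str, str]] = field(default_factory=list)   # (from, to)

    def add_block(self, block: Block):
        self.blocks[block.name] = block

    def link(self, src: str, dst: str):
        self.links.append((src, dst))

    def run(self, start: str, context: dict) -> dict:
        """Follow links from `start`, applying each block's action in order."""
        nxt = {src: dst for src, dst in self.links}
        current = start
        while current is not None:
            context = self.blocks[current].action(context)
            current = nxt.get(current)
        return context

seq = Sequence()
seq.add_block(Block("ingest", lambda ctx: {**ctx, "raw": [1, 2, 3]}))
seq.add_block(Block("enrich", lambda ctx: {**ctx, "sum": sum(ctx["raw"])}))
seq.link("ingest", "enrich")
print(seq.run("ingest", {}))                  # {'raw': [1, 2, 3], 'sum': 6}
```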
  • Patent number: 12173923
    Abstract: The present invention discloses a bladeless cooling fan that addresses the poor user experience of fans in the prior art. The bladeless cooling fan comprises: a fan body comprising an air inlet; a water cooling component provided at the air inlet, wherein the water cooling component cools the air that enters the fan body via the air inlet; a wind wheel component accommodated in the fan body and arranged to generate air flow; and an air outlet component installed on the fan body, wherein the air outlet component is arranged to receive the air flow from the fan body and guide the air flow to be ejected in a set direction.
    Type: Grant
    Filed: March 18, 2021
    Date of Patent: December 24, 2024
    Assignee: Airtek International Corporation Limited
    Inventors: Sanjay Varma, Anand Vyas, Huoxi Zhou, Yi Pu, Hongbo Shen
  • Patent number: 12174672
    Abstract: An electronic device, method, and computer program product enable a space-saving rollable display device to extend automatically in a power-efficient manner. While a blade assembly is in a retracted position, a controller of the electronic device determines whether a user notification opportunity exists based on a transition between contextual states: either the device is stationary and changes from unattended to attended by the user, or the device is stowed on the user's body and changes to being held by the user. In response, the controller actuates a translation mechanism to slide the blade assembly, which includes a blade and a flexible display, relative to a device housing of the electronic device from the retracted position to an at least partially extended position. The actuation prepares the flexible display to present user notification(s) and gives the user a physical indication that the user notification(s) are available to be presented when the flexible display is active.
    Type: Grant
    Filed: October 4, 2023
    Date of Patent: December 24, 2024
    Assignee: Motorola Mobility LLC
    Inventors: Amit Kumar Agrawal, Xiaofeng Zhu, Sanjay Dhar
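A sketch of the notification-driven extension logic described in the abstract above: extend the blade only when a pending notification coincides with one of the two contextual transitions. The state fields and decision function are illustrative assumptions.

```python
# Sketch: decide when to actuate the translation mechanism.
from dataclasses import dataclass

@dataclass
class Context:
    stationary: bool
    attended: bool       # user attention detected
    on_body: bool        # stowed on the user's body (e.g., in a pocket)
    held: bool

def should_extend(prev: Context, curr: Context, has_notification: bool) -> bool:
    """Return True when the blade assembly should slide to an extended position."""
    if not has_notification:
        return False
    # Transition 1: device stationary, attention changes from absent to present.
    attention_gained = curr.stationary and not prev.attended and curr.attended
    # Transition 2: device goes from stowed on-body to held by the user.
    picked_up = prev.on_body and not prev.held and curr.held
    return attention_gained or picked_up

prev = Context(stationary=True, attended=False, on_body=False, held=False)
curr = Context(stationary=True, attended=True, on_body=False, held=False)
print(should_extend(prev, curr, has_notification=True))   # True -> actuate mechanism
```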
  • Patent number: 12177077
    Abstract: Disclosed are systems, methods, and computer-readable media for assuring tenant forwarding in a network environment. Network assurance can be determined in layer 1, layer 2, and layer 3 of the networked environment, including internal-internal (e.g., inter-fabric) forwarding and internal-external (e.g., outside the fabric) forwarding in the networked environment.
    Type: Grant
    Filed: January 24, 2023
    Date of Patent: December 24, 2024
    Assignee: Cisco Technology, Inc.
    Inventors: Sanchay Harneja, Sanjay Sundaresan, Harsha Jagannati
  • Patent number: 12173239
    Abstract: A process and system for upgrading an organic feedstock, including providing an organic feedstock and water mixture; feeding the mixture into a high-rate hydrothermal reactor, wherein the mixture is rapidly heated and subjected to heat, pressure, and turbulent flow; and maintaining the heat and pressure of the mixture for a residence time of less than three minutes to cause the organic components of the mixture to undergo conversion reactions, resulting in increased yields of distillate fuels, higher-quality kerosene and diesel fuels, and the formation of high-octane naphtha compounds. The hydrocarbon products are cooled at a rate sufficient to inhibit additional reaction and to recover process heat, then depressurized and separated for further processing.
    Type: Grant
    Filed: October 22, 2013
    Date of Patent: December 24, 2024
    Assignee: Applied Research Associates, Inc.
    Inventors: Edward N. Coppola, Charles Red, Jr., Sanjay Nana
  • Patent number: 12175389
    Abstract: State of the art predictive maintenance systems that generate predictions with respect to maintenance of High Performance Computing (HPC) systems have the disadvantage that they either are reactive, or the predictions are affected by quality issues associated with the data being collected from the HPC systems. The disclosure herein generally relates to predictive maintenance, and, more particularly, to a method and system for predictive maintenance of High Performance Computing (HPC) systems. The system performs abstraction and cleansing on performance data collected from the HPC systems and generates cleansed performance data, on which a Machine Learning (ML) prediction is applied to generate predictions with respect to maintenance of the HPC systems.
    Type: Grant
    Filed: September 22, 2021
    Date of Patent: December 24, 2024
    Assignee: TATA CONSULTANCY SERVICES LIMITED
    Inventors: Rajesh Gopalrao Kulkarni, Amit Kalele, Anubhav Jain, Sanjay Lalwani, Pradeep Gameria
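A minimal sketch of the abstraction-and-cleansing step followed by an ML prediction, as described in the abstract above. The cleansing rules (drop missing rows, clip outliers) and the linear model are illustrative assumptions; the patent specifies neither.

```python
# Sketch: cleanse HPC telemetry, then fit and apply a simple predictor.
import numpy as np

def cleanse(raw: np.ndarray) -> np.ndarray:
    """Drop rows with missing values and clip extreme outliers."""
    rows = raw[~np.isnan(raw).any(axis=1)]
    lo, hi = np.percentile(rows, [1, 99], axis=0)
    return np.clip(rows, lo, hi)

# Toy telemetry: columns = [node temperature (C), fan speed (krpm)], plus a
# "days until maintenance needed" label in the last column.
raw = np.array([[65.0, 4.2, 40.0],
                [70.0, 4.0, 33.0],
                [np.nan, 4.1, 30.0],      # data-quality issue -> removed by cleansing
                [80.0, 3.6, 18.0],
                [88.0, 3.1, 7.0]])

clean = cleanse(raw)
X = np.hstack([clean[:, :2], np.ones((clean.shape[0], 1))])   # features + bias term
y = clean[:, 2]
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)                # fit linear model

new_sample = np.array([85.0, 3.3, 1.0])                       # current readings + bias
print(f"predicted days to maintenance: {new_sample @ coeffs:.1f}")
```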
  • Patent number: 12174757
    Abstract: Methods and apparatuses are provided to reduce latencies associated with state transitions in die-to-die interconnect architectures. In one example, a physical layer of a die detects a first event indicating a transition to a lower power state. In response to the first event, the physical layer transitions to a lower power state where one or more clock configuration values are read from registers and stored in memory. The physical layer then detects a second event indicating a transition to an active state. In response to the second event, the physical layer reads the clock configuration values from the memory, and writes the clock configuration values to the registers. The physical layer then transitions to a power stabilization state, and remains in the power stabilization state for an amount of time to allow clocks to stabilize. The physical layer then transitions to a training state.
    Type: Grant
    Filed: June 21, 2023
    Date of Patent: December 24, 2024
    Assignee: QUALCOMM Incorporated
    Inventors: Santhosh Reddy Akavaram, Prakhar Srivastava, Sridhar Anumala, Ramacharan Sundararaman, Sonali Jabreva, Khushboo Kumari, Sanjay Verdu
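A sketch of the save/restore flow described in the abstract above: on the way into a low-power state the clock configuration registers are captured to memory, and on wake they are written back, followed by a stabilization wait and a training state. The register names, state names, and timing are illustrative assumptions.

```python
# Sketch: physical-layer state machine that saves/restores clock configuration.
import time

class PhyStateMachine:
    def __init__(self, registers: dict[str, int]):
        self.registers = registers          # live clock-configuration registers
        self.saved = {}                      # backing memory for saved values
        self.state = "ACTIVE"

    def on_low_power_event(self):
        """First event: capture clock configuration values and enter low power."""
        self.saved = dict(self.registers)
        self.state = "LOW_POWER"

    def on_wake_event(self, stabilization_s: float = 0.001):
        """Second event: restore registers, stabilize clocks, then train."""
        self.registers.update(self.saved)   # write saved values back to registers
        self.state = "POWER_STABILIZATION"
        time.sleep(stabilization_s)         # remain here to allow clocks to stabilize
        self.state = "TRAINING"

phy = PhyStateMachine({"pll_div": 4, "clk_src": 1})
phy.on_low_power_event()
phy.registers["pll_div"] = 0                # register contents lost in low power
phy.on_wake_event()
print(phy.state, phy.registers)             # TRAINING {'pll_div': 4, 'clk_src': 1}
```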
  • Publication number: 20240416212
    Abstract: A communication system for athletes or hearing-impaired users is disclosed. The communication system includes a triggering device having a housing to receive an electronic whistle, such that a switch cap snaps on a whistle switch to align with a triggering switch. A main switch sandwiches the triggering switch with the switch cap, such that when the first user accesses the main switch then the whistle and triggering switch are activated simultaneously for operating the electronic whistle and the triggering device in tandem to facilitate the first user for wirelessly communicating with the second users. The communication system also includes haptic wearable devices associated with second users and having a radio transceiver to receive the transmitted radio frequency signal and generate a vibration to notify the second user about the one or more instructions by the first user.
    Type: Application
    Filed: June 13, 2023
    Publication date: December 19, 2024
    Inventors: Corina Chen, Krithik Duvvuri, Eesha Sanjay, Zhilu Xie, Varshini Vijay, Ziyi Gao
  • Publication number: 20240420320
    Abstract: The present disclosure relates to a method. The method includes accessing automatically segmented liver data and automatically segmented spleen data from a patient. The automatically segmented liver data is used to determine a liver attenuation and the automatically segmented spleen data is used to determine a spleen attenuation. A liver-to-spleen attenuation ratio is determined from the liver attenuation and the spleen attenuation.
    Type: Application
    Filed: September 6, 2023
    Publication date: December 19, 2024
    Inventors: Gourav Modanwal, Sadeer Al-Kindi, Jonathan Walker, Rohan Dhamdhere, Sanjay Rajagopalan, Anant Madabhushi
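A minimal sketch of the ratio computation described in the abstract above: mean CT attenuation (in Hounsfield units) over the segmented liver and spleen voxels, then the liver-to-spleen ratio. The toy volume and mask placement below are illustrative only.

```python
# Sketch: liver-to-spleen attenuation ratio from segmentation masks.
import numpy as np

def mean_attenuation(ct_volume: np.ndarray, mask: np.ndarray) -> float:
    """Mean HU over the voxels selected by a binary segmentation mask."""
    return float(ct_volume[mask > 0].mean())

ct = np.random.default_rng(0).normal(40.0, 10.0, size=(4, 64, 64))  # toy CT volume
liver_mask = np.zeros_like(ct, dtype=np.uint8)
spleen_mask = np.zeros_like(ct, dtype=np.uint8)
liver_mask[:, :32, :] = 1                     # stand-ins for automatic segmentations
spleen_mask[:, 32:, :] = 1

liver_hu = mean_attenuation(ct, liver_mask)
spleen_hu = mean_attenuation(ct, spleen_mask)
ratio = liver_hu / spleen_hu                  # liver-to-spleen attenuation ratio
print(f"liver={liver_hu:.1f} HU  spleen={spleen_hu:.1f} HU  ratio={ratio:.2f}")
```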
  • Publication number: 20240419738
    Abstract: A method for expanding a hierarchical taxonomy associated with a corpus of documents may include performing cluster analysis on documents associated with a leaf node of the taxonomy to determine a plurality of first clusters, each cluster of the plurality of first clusters being associated with one or more documents associated with the leaf node; determining one or more topics associated with the documents in each cluster of the plurality of first clusters; performing cluster analysis on the topics to determine a plurality of second clusters, each cluster of the plurality of second clusters being associated with one or more of the topics; determining a name for each cluster of the plurality of second clusters; and expanding the hierarchical taxonomy based on the topics associated with the documents in each cluster of the plurality of first clusters and the determined name for each cluster of the plurality of second clusters.
    Type: Application
    Filed: June 14, 2024
    Publication date: December 19, 2024
    Applicant: RELX Inc.
    Inventors: Sanjay Sharma, Mark Shewhart
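A minimal sketch of the two-level clustering described in the abstract above: cluster the documents under a leaf node, derive topics per cluster, cluster the topics, and attach the resulting names as children of the leaf. Using scikit-learn and "top TF-IDF terms as topics" is an assumed heuristic for illustration, not the patented method.

```python
# Sketch: documents -> first clusters -> topics -> second clusters -> new child nodes.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
import numpy as np

leaf_docs = ["contract breach damages litigation",
             "patent infringement damages claim",
             "merger acquisition due diligence",
             "acquisition financing due diligence"]

# First clustering: group the leaf node's documents.
vec = TfidfVectorizer()
X = vec.fit_transform(leaf_docs)
doc_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Topics per first-level cluster: top TF-IDF terms of the cluster centroid.
terms = np.array(vec.get_feature_names_out())
topics = []
for c in range(2):
    centroid = X[doc_labels == c].mean(axis=0).A1
    topics.append(" ".join(terms[centroid.argsort()[::-1][:2]]))

# Second clustering: group the topics, name each cluster by its first topic,
# and expand the taxonomy under the leaf node.
topic_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vec.transform(topics))
taxonomy = {"leaf_node": [topics[np.where(topic_labels == c)[0][0]] for c in range(2)]}
print(taxonomy)
```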
  • Publication number: 20240419678
    Abstract: A data processing and analysis system that optimizes the resources to be used for data storage and refresh events. A partitioner module for a data analysis system can receive a first client criteria and a first client dataset that includes tabular data and calculate scores that are used to generate partitioning strategies. The selected partitioning strategy can be implemented to produce aggregated data that can be stored in an intelligent data mart. The partitions can then be accessed by a data visualization platform for intelligent, dynamic responses to user requests for data analyses and generation of visualizations. By providing synchronous partitioning of data (especially big data) and intelligent refresh, the data can move from the back-end to the front-end with minimal user clicks and minimal latency in performance.
    Type: Application
    Filed: June 14, 2023
    Publication date: December 19, 2024
    Inventors: Sanjay Sharma, Reema Malhotra, Prachi Rajesh Sawant, Jain Abhishek Kumar, Abhinav Kumar, Gaurav Yadav
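A sketch of the partitioner idea in the abstract above: score a few candidate partitioning strategies against client criteria, pick the best, and store the resulting aggregates in a dictionary standing in for the intelligent data mart. The scoring weights and candidate strategies are illustrative assumptions.

```python
# Sketch: score partitioning strategies, select one, aggregate into a data mart.
from collections import defaultdict

rows = [{"region": "west", "month": "2024-01", "sales": 120},
        {"region": "west", "month": "2024-02", "sales": 90},
        {"region": "east", "month": "2024-01", "sales": 200}]

client_criteria = {"frequent_filter": "region", "refresh_cost_weight": 0.3}
strategies = ["region", "month"]              # candidate partition keys

def score(strategy: str, criteria: dict) -> float:
    """Higher when the key matches the client's common filter and yields
    fewer partitions to refresh."""
    match_bonus = 1.0 if strategy == criteria["frequent_filter"] else 0.0
    n_partitions = len({r[strategy] for r in rows})
    return match_bonus - criteria["refresh_cost_weight"] * n_partitions

best = max(strategies, key=lambda s: score(s, client_criteria))

# Aggregate per partition and load into the data-mart stand-in.
data_mart = defaultdict(float)
for r in rows:
    data_mart[r[best]] += r["sales"]
print(best, dict(data_mart))                  # region {'west': 210.0, 'east': 200.0}
```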
  • Publication number: 20240419166
    Abstract: A fleet management system is disclosed. The system may include a transceiver configured to receive vehicle information from each vehicle of a vehicle fleet and environmental condition information associated with each vehicle. The fleet management system may further include a processor configured to obtain the vehicle information and the environmental condition information, and predict a future vehicle efficiency and a future vehicle health of each vehicle. Based on the prediction, the processor may calculate first resources required to operate the vehicle fleet according to a current fleet allocation. The processor may determine an updated fleet allocation for the vehicle fleet, and calculate second resources required to operate the vehicle fleet according to the updated fleet allocation. The processor may transmit an instruction to at least one vehicle of the vehicle fleet to re-locate based on the updated fleet allocation when the second resources are less than the first resources.
    Type: Application
    Filed: June 13, 2023
    Publication date: December 19, 2024
    Applicant: Ford Global Technologies, LLC
    Inventors: Stuart C. Salter, Sanjay Dayal, Pietro Buttolo, Ryan O'Gorman, Brendan Diamond, Peter Phung
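A sketch of the comparison in the abstract above: estimate the resources needed under the current allocation and under a candidate reallocation, and issue re-locate instructions only when the candidate is cheaper. The efficiency/health cost model and all numbers are illustrative assumptions.

```python
# Sketch: compare first (current) vs. second (updated) fleet resource estimates.
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str
    predicted_efficiency: float    # e.g. predicted miles per kWh
    predicted_health: float        # 0..1, lower means more maintenance overhead

def resources_required(assignments: dict[str, float], fleet: dict[str, Vehicle]) -> float:
    """Energy-equivalent cost of serving each vehicle's assigned route miles."""
    total = 0.0
    for vid, route_miles in assignments.items():
        v = fleet[vid]
        total += route_miles / (v.predicted_efficiency * v.predicted_health)
    return total

fleet = {"v1": Vehicle("v1", 3.0, 0.9), "v2": Vehicle("v2", 2.2, 0.7)}
current_allocation = {"v1": 40.0, "v2": 120.0}          # miles assigned today
updated_allocation = {"v1": 120.0, "v2": 40.0}          # healthier vehicle takes more

first = resources_required(current_allocation, fleet)
second = resources_required(updated_allocation, fleet)
if second < first:
    print(f"re-locate per updated allocation ({second:.1f} < {first:.1f})")
```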
  • Publication number: 20240419834
    Abstract: Systems, computer program products, and methods are described herein for transformative data analysis and data modeling in a distributed network. The present invention is configured to receive sample data from a source system into a data ingestion engine, transmit, from the data ingestion engine to a data governance engine, a request for metadata of the sample data, construct metadata of the sample data, determine a data transformation of the sample data based on the schema, transform the sample data via the data transformation, and store transformed sample data and corresponding metadata in a shared data store. First and second views may also be generated, where the first view is a raw data view, and the second view is a processed view of the transformed data.
    Type: Application
    Filed: June 14, 2023
    Publication date: December 19, 2024
    Applicant: BANK OF AMERICA CORPORATION
    Inventors: Ganesan Vijayan, Himanshu Goyal, Sanjay Negi
  • Publication number: 20240419493
    Abstract: A method, computer program product, and computing system for processing workload data associated with processing a plurality of requests for an artificial intelligence (AI) model on a processing unit. A maximum number of key-value (KV) cache blocks available for the workload data is determined by simulating the workload data using a simulation engine. A token utilization for the workload data is determined based upon, at least in part, the maximum number of KV cache blocks available for the workload data. Processing unit resources are allocated for the processing unit based upon, at least in part, the token utilization.
    Type: Application
    Filed: June 14, 2023
    Publication date: December 19, 2024
    Inventors: Sanjay Ramanujan, Karthik Raman, Rakesh Kelkar, Kalyan Kumar Bhukya, Archit Shukla, Pei-Hsuan Hsieh
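A sketch of the sizing arithmetic described in the abstract above: simulate a request workload to find the peak number of key-value (KV) cache blocks in use, derive a token-utilization figure, and scale the processing-unit allocation by it. The block size, concurrency window, and utilization formula are illustrative assumptions.

```python
# Sketch: simulate workload -> peak KV cache blocks -> token utilization -> allocation.
BLOCK_TOKENS = 16                       # tokens per KV cache block (assumed)

def blocks_needed(tokens: int) -> int:
    return -(-tokens // BLOCK_TOKENS)   # ceiling division

# Workload data: (arrival_step, prompt_tokens + generated_tokens) per request.
workload = [(0, 900), (0, 450), (1, 1300), (2, 700)]
max_concurrent = 2                      # requests served simultaneously (assumed)

# Simulate peak block usage across a sliding window of concurrent requests.
peak_blocks = 0
active = []
for step, tokens in workload:
    active.append(blocks_needed(tokens))
    if len(active) > max_concurrent:
        active.pop(0)
    peak_blocks = max(peak_blocks, sum(active))

available_blocks = 256                  # blocks the simulation says are available
token_utilization = min(1.0, peak_blocks / available_blocks)
print(f"peak blocks={peak_blocks}  token utilization={token_utilization:.2%}  "
      f"allocate ~{token_utilization:.0%} of one processing unit")
```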
  • Publication number: 20240419992
    Abstract: A method is provided for performing dynamic inferencing at a node configured to communicate with other nodes of an IoT hierarchy. In the method, a schema for an asset object associated with one or more of at least one physical process and at least one physical device is received at the node. The schema is formatted according to an inferencing engine model format. An artificial intelligence model capable of being executed in an inferencing engine is received at the node. Data indicative of one or more of a current state of at least one physical process and a current state of at least one physical device is received at the node. The received data is processed at the node according to the schema and an inferencing engine. The inferencing engine generates a new predictive attribute based on the set of attributes, and the processing normalizes the received data according to the schema to generate normalized data; the normalized data includes the predictive attribute from the inferencing engine.
    Type: Application
    Filed: August 22, 2024
    Publication date: December 19, 2024
    Inventors: David Aaron Allsbrook, Eric Michael Simone, Rajas Sanjay Deshpande
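A sketch of the flow described in the abstract above: a node receives a schema and a model, normalizes incoming device data against the schema, and lets the inferencing step add a new predictive attribute. The schema format and the tiny linear "model" are illustrative assumptions.

```python
# Sketch: normalize received data to a schema, then append a predicted attribute.
schema = {                                       # asset-object schema (assumed format)
    "attributes": {"temperature_c": float, "vibration_mm_s": float},
    "predicted_attribute": "failure_risk",
}

def model(attrs: dict) -> float:
    """Stand-in for an AI model executed by the inferencing engine."""
    return min(1.0, 0.01 * attrs["temperature_c"] + 0.05 * attrs["vibration_mm_s"])

def process(raw: dict, schema: dict, model) -> dict:
    """Normalize raw device data to the schema, then append the prediction."""
    normalized = {}
    for name, caster in schema["attributes"].items():
        if name in raw:
            normalized[name] = caster(raw[name])     # coerce to the schema's type
    normalized[schema["predicted_attribute"]] = model(normalized)
    return normalized

raw_reading = {"temperature_c": "71.5", "vibration_mm_s": 3, "unused_field": "x"}
print(process(raw_reading, schema, model))
# {'temperature_c': 71.5, 'vibration_mm_s': 3.0, 'failure_risk': 0.865}
```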
  • Publication number: 20240419595
    Abstract: Techniques for coherency management based on coherent hierarchical cache line tracking are disclosed. A plurality of processor cores is accessed. Each processor core of the plurality of processor cores includes a local cache. A hierarchical cache is coupled to the plurality of processor cores. The hierarchical cache is shared among the plurality of processor cores. Coherency between the plurality of processor cores and the hierarchical cache is managed by a compute coherency block (CCB). A cache line directory is provided for the CCB. The cache line directory includes a core list field and a cache line present field. A cache line operation is detected. The cache line operation is detected by the CCB. The cache line operation is represented by an entry in the cache line directory. The cache line operation is performed, based on corresponding values of the core list field and the cache line present field.
    Type: Application
    Filed: June 5, 2024
    Publication date: December 19, 2024
    Applicant: Akeana, Inc.
    Inventor: Sanjay Patel
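A sketch of the cache line directory described in the abstract above: each entry tracks which cores hold a copy (core list field) and whether the line is present in the shared hierarchical cache (cache line present field). The operations and field encodings are illustrative assumptions, not the patented design.

```python
# Sketch: a compute coherency block (CCB) maintaining a cache line directory.
from dataclasses import dataclass, field

@dataclass
class DirectoryEntry:
    core_list: set[int] = field(default_factory=set)   # cores with the line in local cache
    line_present: bool = False                          # line present in hierarchical cache

class CoherencyBlock:
    """CCB managing the cache line directory for a shared hierarchical cache."""
    def __init__(self):
        self.directory: dict[int, DirectoryEntry] = {}

    def on_read(self, addr: int, core: int):
        entry = self.directory.setdefault(addr, DirectoryEntry())
        entry.core_list.add(core)
        entry.line_present = True                       # line filled into the shared cache

    def on_write(self, addr: int, core: int):
        entry = self.directory.setdefault(addr, DirectoryEntry())
        # Use the core list field to know whose copies to invalidate; keep the writer.
        entry.core_list = {core}
        entry.line_present = True

ccb = CoherencyBlock()
ccb.on_read(0x1000, core=0)
ccb.on_read(0x1000, core=2)
ccb.on_write(0x1000, core=2)         # write detected; directory drives invalidations
print(ccb.directory[0x1000])         # DirectoryEntry(core_list={2}, line_present=True)
```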