Patents by Inventor Chia-Wei HSING

Chia-Wei HSING has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230409961
Abstract: A method of applying non-linear regression to a set of data points to obtain an estimate is described herein. The method includes receiving a set of N datapoints, separating the set into Nb batches, receiving a family of fitting functions, and minimizing a log-cosh cost function for each batch by selecting parameters that minimize it. The parameters are obtained by receiving a matrix product state (MPS) model and training the MPS to minimize loss over all the Nb batches, including choosing an MPS with M+D tensors. All tensors except the D extra tensors correspond to one datapoint in each of the Nb batches; the D extra tensors have a physical dimension of size M, corresponding to the number of possible outputs for a given batch; and the coefficients of the tensors in the MPS minimize the log-cosh cost function sequentially over all the Nb batches.
    Type: Application
    Filed: July 5, 2022
    Publication date: December 21, 2023
    Applicant: Multiverse Computing SL
    Inventors: Chia-Wei Hsing, Román Orús, Samuel Mugel, Saeed Jahromi, Serkan Sahin, Samuel Palmer
  • Publication number: 20230409665
Abstract: A method of embedding ordinary differential equations (ODEs) into tensor radial basis networks is presented herein. The method involves receiving a tensored basis function having D dimensions and zeroth-, first-, and second-derivative coefficients A_d, B_d, and C_d; defining A_hat and B_hat as functions of A, B, and D, and C_hat as a function of A, C, and D; defining an orthogonal exotic algebra a, b, c; applying a, b, and c, along with A_hat, B_hat, and C_hat, as coefficients for the zeroth-, first-, and second-derivative terms; and embedding the updated tensored basis function by forming a matrix product state (MPS). The MPS can be trained by initializing its 3-tensors with random coefficients and sweeping left and right along the MPS while updating the 3-tensors.
    Type: Application
    Filed: July 5, 2022
    Publication date: December 21, 2023
    Applicant: Multiverse Computing SL
    Inventors: Samuel Palmer, Raj Patel, Román Orús, Saeed Jahromi, Chia-Wei Hsing, Serkan Sahin
  • Publication number: 20230341824
Abstract: A device or system configured to: receive a set of data associated with a monitored target, the set having N features, where N is a natural number greater than one; and input the N features into a trained neural network to determine a condition or characteristic of the target, the network having been trained with a plurality of sets of historical data associated with the target, each set having N features. The neural network has at least N inputs, one or more outputs, and one or more hidden layers, each hidden layer being a tensor network in the form of a matrix product operator (MPO) with a respective plurality of tensors and a respective predetermined activation function per hidden layer or per tensor in the MPO. The condition or characteristic of the target is determined by processing the one or more outputs. Also described is a device or system configured to train such a neural network.
    Type: Application
    Filed: April 26, 2022
    Publication date: October 26, 2023
    Inventors: Román ORÚS, Samuel MUGEL, Serkan SAHIN, Saeed JAHROMI, Chia-Wei HSING, Raj PATEL, Samuel PALMER
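
The log-cosh cost at the heart of publication 20230409961 can be illustrated without the MPS machinery. The sketch below, in Python/NumPy, shows the cost itself and the batch-splitting step; the function names and the `make_batches` helper are illustrative, not taken from the patent:

```python
import numpy as np

def log_cosh_loss(y_pred, y_true):
    """Log-cosh cost: roughly quadratic for small residuals (like L2)
    but linear for large ones (like L1), so it is robust to outliers.
    logaddexp(d, -d) = log(2*cosh(d)) avoids overflow for large residuals."""
    d = y_pred - y_true
    return np.mean(np.logaddexp(d, -d) - np.log(2.0))

def make_batches(x, y, n_batches):
    """Separate a set of N datapoints into Nb batches (the method's first step)."""
    return list(zip(np.array_split(x, n_batches), np.array_split(y, n_batches)))
```

For a zero residual the cost is exactly 0, and log(cosh(x)) ≈ x²/2 near zero, which is why the cost behaves like mean squared error for well-fit points while penalizing outliers only linearly.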
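
The MPS training loop mentioned in publication 20230409665 — initialize the 3-tensors with random coefficients, then sweep left and right along the MPS updating them — can be sketched in miniature. The patent abstract does not specify the per-tensor update rule, so the toy below substitutes a plain gradient step on a squared error for a single index configuration; it is an illustrative simplification, not the patented method:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_mps(n_sites, phys_dim, bond_dim):
    """Initialize MPS 3-tensors (left bond, physical leg, right bond) randomly."""
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.normal(scale=0.5, size=(dims[i], phys_dim, dims[i + 1]))
            for i in range(n_sites)]

def mps_eval(mps, indices):
    """Contract the MPS for one configuration of physical indices."""
    v = np.ones((1,))
    for tensor, k in zip(mps, indices):
        v = v @ tensor[:, k, :]
    return v[0]

def sweep(mps, indices, target, lr=0.2):
    """One left-to-right then right-to-left sweep: each 3-tensor gets a
    gradient step on the squared error (a stand-in for the real update)."""
    n = len(mps)
    for i in list(range(n)) + list(range(n - 2, -1, -1)):
        # Environment: contraction of every tensor except site i.
        left = np.ones((1,))
        for t, k in zip(mps[:i], indices[:i]):
            left = left @ t[:, k, :]
        right = np.ones((1,))
        for t, k in zip(reversed(mps[i + 1:]), reversed(indices[i + 1:])):
            right = t[:, k, :] @ right
        err = mps_eval(mps, indices) - target
        grad = np.zeros_like(mps[i])
        grad[:, indices[i], :] = err * np.outer(left, right)
        mps[i] -= lr * grad
    return mps
```

Because the MPS value is multilinear in its tensors, the per-site gradient is just the error times the outer product of the left and right environments, and repeated sweeps drive the contracted value toward the target.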
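
The hidden-layer construction in publication 20230341824 — a layer whose weight tensor is a matrix product operator with an activation function — can be sketched by contracting the MPO cores back into a dense matrix. A real implementation would contract the input through the cores directly to keep the compression benefit; the reconstruction here is only for clarity, and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def mpo_to_matrix(cores):
    """Contract MPO cores of shape (bond_l, in_dim, out_dim, bond_r)
    into a dense weight matrix, grouping the physical legs."""
    w = cores[0]
    for core in cores[1:]:
        # Contract the shared bond, then merge input and output legs.
        w = np.einsum('aijb,bklc->aikjlc', w, core)
        a, i, k, j, l, c = w.shape
        w = w.reshape(a, i * k, j * l, c)
    return w[0, :, :, 0]  # outer bonds have size 1

def mpo_layer(x, cores, activation=np.tanh):
    """Hidden layer with MPO-factorized weights and one activation per layer."""
    return activation(x @ mpo_to_matrix(cores))
```

The point of the MPO form is parameter count: a chain of small 4-index cores stores far fewer coefficients than the full weight matrix it represents, while the layer still computes an ordinary affine map followed by the chosen activation.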