Patents by Inventor William G. Macready

William G. Macready has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210289020
    Abstract: A digital processor runs a machine learning algorithm in parallel with a sampling server. The sampling server may continuously or intermittently draw samples for the machine learning algorithm during execution of the machine learning algorithm, for example on a given problem. The sampling server may run in parallel (e.g., concurrently, overlapping, simultaneously) with a quantum processor to draw samples from the quantum processor.
    Type: Application
    Filed: September 26, 2017
    Publication date: September 16, 2021
    Inventors: Jason T. Rolfe, William G. Macready, Mani Ranjbar, Mayssam Mohammad Nevisi
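A minimal sketch of the parallel sampling arrangement described in this entry's abstract, assuming a classical random sampler stands in for the quantum processor; the queue size, sample format, and timings are invented for illustration.

```python
# A background "sampling server" thread keeps a queue stocked with samples
# while a training loop consumes them concurrently.
import queue
import random
import threading
import time

sample_queue = queue.Queue(maxsize=64)
stop_event = threading.Event()

def sampling_server(num_vars=8):
    """Continuously draw spin configurations and push them to the queue."""
    while not stop_event.is_set():
        sample = [random.choice((-1, +1)) for _ in range(num_vars)]
        try:
            sample_queue.put(sample, timeout=0.1)
        except queue.Full:
            pass  # training loop is lagging; drop this sample

def training_loop(steps=50):
    """Consume samples as they become available, e.g., to estimate gradients."""
    for step in range(steps):
        batch = [sample_queue.get() for _ in range(4)]
        # ... use `batch` here, e.g., to estimate gradients of a model ...
        time.sleep(0.01)  # stand-in for the work done in one training step

server = threading.Thread(target=sampling_server, daemon=True)
server.start()
training_loop()
stop_event.set()
```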
  • Patent number: 11042811
    Abstract: A computational system can include digital circuitry and analog circuitry, for instance a digital processor and a quantum processor. The quantum processor can operate as a sample generator providing samples. Samples can be employed by the digital processor in implementing various machine learning techniques. For example, the computational system can perform unsupervised learning over an input space, for example via a discrete variational auto-encoder, and attempt to maximize the log-likelihood of an observed dataset. Maximizing the log-likelihood of the observed dataset can include generating a hierarchical approximating posterior. Unsupervised learning can include generating samples of a prior distribution using the quantum processor. Generating samples using the quantum processor can include forming chains of qubits and representing discrete variables by chains.
    Type: Grant
    Filed: October 5, 2017
    Date of Patent: June 22, 2021
    Assignee: D-WAVE SYSTEMS INC.
    Inventors: Jason Rolfe, William G. Macready, Zhengbing Bian, Fabian A. Chudak
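A minimal sketch of the "chains of qubits" idea from this entry's abstract, assuming a classical array of spins; the chain layout, coupling strength, and sample values are invented for illustration.

```python
# A logical binary variable is represented by a "chain" of physical spins tied
# together by a strong ferromagnetic coupling, and its logical value is
# recovered from a sample by majority vote over the chain.
import numpy as np

n_physical = 6                       # physical spins available
chain = [0, 1, 2]                    # spins forming one logical variable
chain_strength = -2.0                # strong ferromagnetic Ising coupling

J = np.zeros((n_physical, n_physical))
for a, b in zip(chain, chain[1:]):   # couple consecutive spins in the chain
    J[a, b] = chain_strength         # couplings programmed into the sampler

def decode_chain(sample, chain):
    """Majority vote over the chain's physical spins -> logical spin value."""
    return 1 if np.sum(sample[chain]) >= 0 else -1

# A hypothetical sample returned by the sampler (values in {-1, +1}).
sample = np.array([+1, +1, -1, -1, +1, -1])
print("logical value:", decode_chain(sample, chain))
```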
  • Publication number: 20210089884
    Abstract: Collaborative filtering systems based on variational autoencoders (VAEs) are provided. VAEs may be trained on row-wise data without necessarily training a paired VAE on column-wise data (or vice-versa), and may optionally be trained via minibatches. The row-wise VAE models the output of the corresponding column-based VAE as a set of parameters and uses these parameters in decoding. In some implementations, a paired VAE is provided which receives column-wise data and models row-wise parameters; each of the paired VAEs may bind their learned column- or row-wise parameters to the output of the corresponding VAE. The paired VAEs may optionally be trained via minibatches. Unobserved data may be explicitly modelled. Methods for performing inference with such VAE-based collaborative filtering systems are also disclosed, as are example applications to search and anomaly detection.
    Type: Application
    Filed: December 12, 2018
    Publication date: March 25, 2021
    Inventors: William G. Macready, Jason T. Rolfe
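A minimal sketch of the "unobserved data" handling from this entry's abstract, assuming a linear autoencoder stands in for the row-wise VAE; the ratings matrix, mask, and weight shapes are invented for illustration.

```python
# Unobserved ratings are simply excluded (masked out) from the row-wise
# reconstruction objective; the decoder holds per-column (per-item) parameters.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, n_latent = 5, 8, 3

ratings = rng.integers(1, 6, size=(n_users, n_items)).astype(float)
mask = rng.random((n_users, n_items)) < 0.6        # True where a rating is observed

encoder = rng.normal(scale=0.1, size=(n_items, n_latent))
decoder = rng.normal(scale=0.1, size=(n_latent, n_items))  # column-wise parameters

def masked_loss(R, M, W_enc, W_dec):
    """Mean squared reconstruction error over observed entries only."""
    Z = (R * M) @ W_enc            # encode each user's observed ratings
    R_hat = Z @ W_dec              # decode to predictions for every item
    return np.sum(M * (R - R_hat) ** 2) / np.sum(M)

print("masked reconstruction loss:", masked_loss(ratings, mask, encoder, decoder))
```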
  • Publication number: 20210019647
    Abstract: A hybrid computer comprising a quantum processor can be operated to perform a scalable comparison of high-entropy samplers. Performing a scalable comparison of high-entropy samplers can include comparing entropy and KL divergence of post-processed samplers. A hybrid computer comprising a quantum processor generates samples for machine learning. The quantum processor is trained by matching data statistics to statistics of the quantum processor. The quantum processor is tuned to match moments of the data.
    Type: Application
    Filed: September 24, 2020
    Publication date: January 21, 2021
    Inventors: William G. Macready, Firas Hamze, Fabian A. Chudak, Mani Ranjbar, Jack R. Raymond, Jason T. Rolfe
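A minimal sketch of the entropy and KL-divergence comparison named in this entry's abstract, assuming two biased classical random samplers stand in for the post-processed samplers; the bit width and sample counts are invented for illustration.

```python
# Empirical distributions over bitstrings are used to estimate each sampler's
# entropy and the KL divergence between the two samplers.
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def draw(bias, n_samples=5000, n_bits=4):
    """Sampler returning bitstrings; `bias` skews each bit toward 1."""
    bits = (rng.random((n_samples, n_bits)) < bias).astype(int)
    return [tuple(row) for row in bits]

def empirical(samples):
    counts = Counter(samples)
    total = sum(counts.values())
    return {state: c / total for state, c in counts.items()}

def entropy(p):
    return -sum(pi * np.log(pi) for pi in p.values())

def kl(p, q, eps=1e-12):
    """KL(p || q), with a small floor so unseen states do not blow up."""
    return sum(pi * np.log(pi / max(q.get(s, 0.0), eps)) for s, pi in p.items())

p = empirical(draw(bias=0.5))
q = empirical(draw(bias=0.6))
print("H(p) =", entropy(p), " KL(p||q) =", kl(p, q))
```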
  • Publication number: 20200401916
    Abstract: Generative and inference machine learning models with discrete-variable latent spaces are provided. Discrete variables may be transformed by a smoothing transformation with overlapping conditional distributions or made natively reparametrizable by definition over a Gumbel distribution. Models may be trained by sampling from different models in the positive and negative phases and/or by sampling with different frequencies in the two phases. Machine learning models may be defined over high-dimensional quantum statistical systems near a phase transition to take advantage of long-range correlations. Machine learning models may be defined over graph-representable input spaces and use multiple spanning trees to form latent representations. Machine learning models may be relaxed via continuous proxies to support a greater range of training techniques, such as importance weighting. Example architectures for (discrete) variational autoencoders using such techniques are also provided.
    Type: Application
    Filed: February 7, 2019
    Publication date: December 24, 2020
    Inventors: Jason T. Rolfe, Amir H. Khoshaman, Arash Vahdat, Mohammad H. Amin, Evgeny A. Andriyash, William G. Macready
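A minimal sketch of a Gumbel-based reparametrization of a discrete latent variable, in the spirit of this entry's abstract, assuming a plain NumPy Gumbel-softmax draw; the logits and temperature are invented for illustration.

```python
# A categorical latent variable is relaxed to a point on the probability
# simplex so that gradients can flow through the sampling step.
import numpy as np

rng = np.random.default_rng(2)

def gumbel_softmax(logits, temperature=0.5):
    """Draw a relaxed one-hot sample from a categorical distribution."""
    u = np.clip(rng.random(logits.shape), 1e-12, 1.0 - 1e-12)
    gumbel_noise = -np.log(-np.log(u))
    y = (logits + gumbel_noise) / temperature
    y = y - y.max()                      # numerical stability
    expy = np.exp(y)
    return expy / expy.sum()

logits = np.array([1.0, 0.2, -0.5])      # unnormalized log-probabilities
print(gumbel_softmax(logits))            # nearly one-hot at low temperature
```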
  • Patent number: 10817796
    Abstract: A hybrid computer comprising a quantum processor can be operated to perform a scalable comparison of high-entropy samplers. Performing a scalable comparison of high-entropy samplers can include comparing entropy and KL divergence of post-processed samplers. A hybrid computer comprising a quantum processor generates samples for machine learning. The quantum processor is trained by matching data statistics to statistics of the quantum processor. The quantum processor is tuned to match moments of the data.
    Type: Grant
    Filed: March 7, 2017
    Date of Patent: October 27, 2020
    Assignee: D-WAVE SYSTEMS INC.
    Inventors: William G. Macready, Firas Hamze, Fabian A. Chudak, Mani Ranjbar, Jack R. Raymond, Jason T. Rolfe
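This grant shares its abstract with publication 20210019647 above; as a complementary illustration, here is a minimal sketch of the moment-matching aspect, assuming a single-spin Boltzmann sampler stands in for the quantum processor; the data, learning rate, and bias parameterization are invented.

```python
# The sampler's bias (field) is adjusted until its first moment matches the
# first moment of the data.
import numpy as np

rng = np.random.default_rng(3)
data = rng.choice([-1, +1], size=1000, p=[0.3, 0.7])
target_moment = data.mean()

bias = 0.0                               # field/bias parameter of the sampler
for step in range(200):
    # P(spin = +1) for a single spin with field `bias` is sigmoid(2 * bias).
    samples = np.where(rng.random(1000) < 1 / (1 + np.exp(-2 * bias)), +1, -1)
    model_moment = samples.mean()
    bias += 0.05 * (target_moment - model_moment)   # move toward the data moment

print("target moment:", target_moment, " model moment:", model_moment)
```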
  • Publication number: 20200257984
    Abstract: The domain adaptation problem is addressed by using the predictions of a trained model over both the source and target domains to retrain the model with the assistance of an auxiliary model and a modified objective function. Inaccuracy in the model's predictions in the target domain is treated as noise and is reduced by using a robust learning framework during retraining, enabling unsupervised training in the target domain. Applications include object detection models, where noise in retraining is reduced by explicitly representing label noise and geometry noise in the objective function and using the ancillary model to inject information about label noise.
    Type: Application
    Filed: January 31, 2020
    Publication date: August 13, 2020
    Inventors: Arash Vahdat, Mani Ranjbar, Mehran Khodabandeh, William G. Macready, Zhengbing Bian
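A minimal sketch of noise-robust retraining on pseudo-labels, in the spirit of this entry's abstract, assuming a simple "bootstrapped" cross-entropy stands in for the patent's modified objective; the blend weight and toy predictions are invented.

```python
# Noisy pseudo-labels produced by the source-trained model are blended with the
# current model's own predictions, so confident disagreements are down-weighted.
import numpy as np

def robust_loss(pred, pseudo_label, beta=0.8, eps=1e-12):
    """Cross-entropy against a mix of the pseudo-label and the prediction."""
    target = beta * pseudo_label + (1.0 - beta) * pred
    return -np.sum(target * np.log(pred + eps))

pred = np.array([0.7, 0.2, 0.1])          # current model's class probabilities
pseudo_label = np.array([0.0, 1.0, 0.0])  # noisy label from the source model
print("robust loss:", robust_loss(pred, pseudo_label))
print("plain cross-entropy:", -np.log(pred[1]))
```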
  • Publication number: 20200210876
    Abstract: A computational system can include digital circuitry and analog circuitry, for instance a digital processor and a quantum processor. The quantum processor can operate as a sample generator providing samples. Samples can be employed by the digital processor in implementing various machine learning techniques. For example, the digital processor can operate as a restricted Boltzmann machine. The computational system can operate as a quantum-based deep belief network operating on a training data-set.
    Type: Application
    Filed: August 18, 2016
    Publication date: July 2, 2020
    Inventors: Jason Rolfe, Dmytro Korenkevych, Mani Ranjbar, Jack R. Raymond, William G. Macready
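A minimal sketch of restricted Boltzmann machine sampling, as named in this entry's abstract, assuming a tiny classical RBM with random weights; in the patent the quantum processor supplies the samples.

```python
# One step of block Gibbs sampling in an RBM: hidden units given visible,
# then visible units given hidden.
import numpy as np

rng = np.random.default_rng(4)
n_visible, n_hidden = 6, 4
W = rng.normal(scale=0.5, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """Sample h ~ p(h|v), then v' ~ p(v|h)."""
    p_h = sigmoid(v @ W + b_h)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(h @ W.T + b_v)
    return (rng.random(n_visible) < p_v).astype(float), h

v = (rng.random(n_visible) < 0.5).astype(float)
for _ in range(10):
    v, h = gibbs_step(v)
print("visible sample:", v)
```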
  • Publication number: 20200167685
    Abstract: Computational systems implement problem solving using hybrid digital/quantum computing approaches. A problem may be represented as a problem graph which is larger and/or has higher connectivity than a working and/or hardware graph of a quantum processor. A quantum processor may be used to determine approximate solutions, which solutions are provided as initial states to one or more digital processors which may implement classical post-processing to generate improved solutions. Techniques for solving problems on extended, more-connected, and/or “virtual full yield” variations of the processor's actual working and/or hardware graphs are provided. A method of operation in a computational system comprising a quantum processor includes partitioning a problem graph into sub-problem graphs, and embedding a sub-problem graph onto the working graph of the quantum processor. The quantum processor and a non-quantum processor-based device generate partial samples.
    Type: Application
    Filed: January 31, 2020
    Publication date: May 28, 2020
    Inventors: Murray C. Thom, Aidan P. Roy, Fabian A. Chudak, Zhengbing Bian, William G. Macready, Robert B. Israel, Kelly T. R. Boothby, Sheir Yarkoni, Yanbo Xue, Dmytro Korenkevych
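A minimal sketch of the problem-partitioning idea from this entry's abstract, assuming brute-force minimization of each sub-problem stands in for the quantum sub-solver; the random Ising instance and the two-block partition are invented.

```python
# The variables are split into blocks, and each block is optimized with the
# other block's spins clamped to their current values.
import itertools
import numpy as np

rng = np.random.default_rng(5)
n = 8
J = np.triu(rng.normal(size=(n, n)), k=1)           # random couplings
h = rng.normal(size=n)                               # local fields
blocks = [list(range(0, 4)), list(range(4, 8))]      # partition of the problem

def energy(s):
    return s @ J @ s + h @ s

s = rng.choice([-1, 1], size=n).astype(float)
for _ in range(3):                                   # a few sweeps over blocks
    for block in blocks:
        best_e, best_assign = np.inf, None
        for assign in itertools.product([-1, 1], repeat=len(block)):
            trial = s.copy()
            trial[block] = assign
            e = energy(trial)
            if e < best_e:
                best_e, best_assign = e, assign
        s[block] = best_assign
print("energy after block sweeps:", energy(s))
```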
  • Publication number: 20200160175
    Abstract: Fully-supervised semantic segmentation machine learning models are augmented by ancillary machine learning models which generate high-detail predictions from low-detail, weakly-supervised data. The combined model can be trained over both fully- and weakly-supervised data. Only the primary model is required for inference, post-training. The combined model can be made self-correcting during training by adjusting the ancillary model's output based on parameters learned over both the fully- and weakly-supervised data. The self-correction module may combine the output of the primary and ancillary models in various ways, including through linear combinations and via neural networks. The self-correction module and ancillary model may benefit from disclosed pre-training techniques.
    Type: Application
    Filed: November 13, 2019
    Publication date: May 21, 2020
    Inventors: Arash Vahdat, Mostafa S. Ibrahim, William G. Macready
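A minimal sketch of the "linear combination" form of the self-correction module mentioned in this entry's abstract, assuming per-pixel class probabilities from the two models are mixed by a single scalar weight; all shapes and values are invented.

```python
# The primary and ancillary models' per-pixel scores are combined and
# normalized to produce the corrected segmentation prediction.
import numpy as np

rng = np.random.default_rng(6)
n_classes, height, width = 3, 4, 4

primary = rng.random((n_classes, height, width))
ancillary = rng.random((n_classes, height, width))

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

alpha = 0.7                                  # mixing weight (learned in practice)
combined = softmax(alpha * primary + (1 - alpha) * ancillary, axis=0)
segmentation = combined.argmax(axis=0)       # per-pixel class prediction
print(segmentation)
```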
  • Patent number: 10599988
    Abstract: Computational systems implement problem solving using hybrid digital/quantum computing approaches. A problem may be represented as a problem graph which is larger and/or has higher connectivity than a working and/or hardware graph of a quantum processor. A quantum processor may be used to determine approximate solutions, which solutions are provided as initial states to one or more digital processors which may implement classical post-processing to generate improved solutions. Techniques for solving problems on extended, more-connected, and/or “virtual full yield” variations of the processor's actual working and/or hardware graphs are provided. A method of operation in a computational system comprising a quantum processor includes partitioning a problem graph into sub-problem graphs, and embedding a sub-problem graph onto the working graph of the quantum processor. The quantum processor and a non-quantum processor-based device generate partial samples.
    Type: Grant
    Filed: March 2, 2017
    Date of Patent: March 24, 2020
    Assignee: D-WAVE SYSTEMS INC.
    Inventors: Murray C. Thom, Aidan P. Roy, Fabian A. Chudak, Zhengbing Bian, William G. Macready, Robert B. Israel, Kelly T.R. Boothby, Sheir Yarkoni, Yanbo Xue, Dmytro Korenkevych
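This grant shares its abstract with publication 20200167685 above; as a complementary illustration, here is a minimal sketch of the classical post-processing step, assuming greedy single-spin flips refine an approximate initial state; the Ising instance and initial state are invented.

```python
# The random initial state plays the role of the quantum processor's
# approximate solution; greedy flips are the classical post-processing.
import numpy as np

rng = np.random.default_rng(7)
n = 10
J = np.triu(rng.normal(size=(n, n)), k=1)
h = rng.normal(size=n)

def energy(s):
    return s @ J @ s + h @ s

s = rng.choice([-1.0, 1.0], size=n)           # approximate initial solution
improved = True
while improved:
    improved = False
    for i in range(n):
        e_before = energy(s)
        s[i] = -s[i]                          # try flipping spin i
        if energy(s) < e_before:
            improved = True                   # keep the improving flip
        else:
            s[i] = -s[i]                      # revert
print("post-processed energy:", energy(s))
```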
  • Publication number: 20200090050
    Abstract: Generative machine learning models, such as variational autoencoders, with comparatively sparse latent spaces are provided. Continuous latent variables are activated and/or inactivated based on a state of the latent space. Activation may be controlled by corresponding binary latent variables and/or by rectification of probability distributions defined over the latent space. Sparsification may be supported by regularization terms, such as an L1 or L2 prior.
    Type: Application
    Filed: September 5, 2019
    Publication date: March 19, 2020
    Inventors: Jason T. Rolfe, Seyed Ali Saberali, William G. Macready
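A minimal sketch of continuous latent variables gated by binary latent variables, with an L1 term, in the spirit of this entry's abstract; the gate probability, penalty weight, and distributions are invented stand-ins.

```python
# Binary gates switch continuous latent variables on or off; inactive units
# contribute nothing, and an L1 term encourages sparsity.
import numpy as np

rng = np.random.default_rng(8)
n_latent = 10

gate_prob = 0.3                                   # prior probability a unit is active
gates = rng.random(n_latent) < gate_prob          # binary latent variables
values = rng.normal(size=n_latent)                # continuous latent variables
latent = np.where(gates, values, 0.0)             # gated (sparse) latent state

l1_penalty = 0.1 * np.abs(latent).sum()           # sparsity-encouraging prior term
print("active units:", int(gates.sum()), " L1 penalty:", l1_penalty)
```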
  • Publication number: 20200019879
    Abstract: Techniques are provided for computing problems represented as directed graphical models via quantum processors with topologies and coupling physics which correspond to undirected graphs. These include techniques for generating approximations of Bayesian networks via a quantum processor capable of computing problems based on a Markov network-based representation of such problems. Approximations may be generated by moralization of Bayesian networks to Markov networks, learning of Bayesian networks' probability distributions by Markov networks' probability distributions, or otherwise, and are trained by executing the resulting Markov network on the quantum processor.
    Type: Application
    Filed: March 19, 2019
    Publication date: January 16, 2020
    Inventors: Yanbo Xue, William G. Macready
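A minimal sketch of the moralization step named in this entry's abstract, assuming a toy Bayesian network given as a child-to-parents map; the network itself is invented for illustration.

```python
# Moralization "marries" the parents of every node and drops edge directions,
# yielding the undirected Markov network structure.
from itertools import combinations

parents = {            # Bayesian network as a child -> parents map
    "A": [],
    "B": [],
    "C": ["A", "B"],
    "D": ["C"],
}

undirected_edges = set()
for child, ps in parents.items():
    for p in ps:                                    # drop edge directions
        undirected_edges.add(frozenset((p, child)))
    for p1, p2 in combinations(ps, 2):              # marry co-parents
        undirected_edges.add(frozenset((p1, p2)))

print(sorted(tuple(sorted(e)) for e in undirected_edges))
# -> [('A', 'B'), ('A', 'C'), ('B', 'C'), ('C', 'D')]
```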
  • Patent number: 10467543
    Abstract: Quantum processor based techniques minimize an objective function for example by operating the quantum processor as a sample generator providing low-energy samples from a probability distribution with high probability. The probability distribution is shaped to assign relative probabilities to samples based on their corresponding objective function values until the samples converge on a minimum for the objective function. Problems having a number of variables and/or a connectivity between variables that does not match that of the quantum processor may be solved. Interaction with the quantum processor may be via a digital computer. The digital computer stores a hierarchical stack of software modules to facilitate interacting with the quantum processor via various levels of programming environment, from a machine language level up to an end-use applications level.
    Type: Grant
    Filed: October 22, 2015
    Date of Patent: November 5, 2019
    Assignee: D-WAVE SYSTEMS INC.
    Inventors: William G. Macready, Mani Ranjbar, Firas Hamze, Geordie Rose, Suzanne Gildert
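A minimal sketch of shaping a probability distribution so that low-objective samples dominate, as described in this entry's abstract, assuming exact Boltzmann sampling over a tiny enumerable state space stands in for the quantum sampler; the Ising instance and temperatures are invented.

```python
# Lowering the temperature shapes the distribution so that samples converge
# on the minimum of the objective function.
import itertools
import numpy as np

rng = np.random.default_rng(9)
n = 4
J = np.triu(rng.normal(size=(n, n)), k=1)
h = rng.normal(size=n)

states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
energies = np.array([s @ J @ s + h @ s for s in states])

for temperature in (2.0, 0.5, 0.1):
    weights = np.exp(-(energies - energies.min()) / temperature)
    probs = weights / weights.sum()
    idx = rng.choice(len(states), size=1000, p=probs)
    best_fraction = np.mean(energies[idx] == energies.min())
    print(f"T={temperature}: fraction of samples at the minimum = {best_fraction:.2f}")
```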
  • Patent number: 10318881
    Abstract: Systems, methods and aspects, and embodiments thereof relate to unsupervised or semi-supervised feature learning using a quantum processor. To achieve unsupervised or semi-supervised feature learning, the quantum processor is programmed to achieve Hierarchical Deep Learning (referred to as HDL) over one or more data sets. Systems and methods search for, parse, and detect maximally repeating patterns in one or more data sets or across data or data sets. Embodiments and aspects regard using sparse coding to detect maximally repeating patterns in or across data. Examples of sparse coding include L0 and L1 sparse coding. Some implementations may involve appending, incorporating or attaching labels to dictionary elements, or constituent elements of one or more dictionaries. There may be a logical association between a label and the element labeled such that the process of unsupervised or semi-supervised feature learning spans both the elements and the incorporated, attached or appended label.
    Type: Grant
    Filed: June 26, 2014
    Date of Patent: June 11, 2019
    Assignee: D-WAVE SYSTEMS INC.
    Inventors: Geordie Rose, Suzanne Gildert, William G. Macready, Dominic Christoph Walliman
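A minimal sketch of L1 sparse coding, one of the techniques named in this entry's abstract, assuming a classical ISTA solver on a random dictionary; the data, dictionary, and penalty weight are invented.

```python
# The soft threshold drives most coefficients to exactly zero, exposing the
# small set of dictionary atoms ("repeating patterns") that explain the signal.
import numpy as np

rng = np.random.default_rng(10)
n_features, n_atoms = 20, 30
D = rng.normal(size=(n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)                  # unit-norm dictionary atoms
x = D[:, [2, 7]] @ np.array([1.5, -2.0])        # signal built from two atoms

lam = 0.1                                       # L1 penalty weight
step = 1.0 / np.linalg.norm(D.T @ D, 2)         # safe ISTA step size
z = np.zeros(n_atoms)
for _ in range(500):
    grad = D.T @ (D @ z - x)
    z = z - step * grad
    z = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print("non-zero coefficients:", np.nonzero(np.abs(z) > 1e-6)[0])
```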
  • Patent number: 10275422
    Abstract: Methods and systems represent a constraint as an Ising model penalty function and a penalty gap associated therewith, the penalty gap separating a set of feasible solutions to the constraint from a set of infeasible solutions to the constraint; and determine the Ising model penalty function subject to the bounds on the programmable parameters imposed by the hardware limitations of the second processor, where the penalty gap exceeds a predetermined threshold greater than zero. Such methods may be employed to find quantum binary optimization problems and associated gap values using a variety of techniques.
    Type: Grant
    Filed: March 27, 2015
    Date of Patent: April 30, 2019
    Assignee: D-WAVE SYSTEMS, INC.
    Inventors: Robert Israel, William G. Macready, Zhengbing Bian, Fabian Chudak, Mani Ranjbar
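A minimal sketch of a penalty function with a penalty gap, in the spirit of this entry's abstract, using the classic QUBO penalty for the constraint z = x AND y (equivalent to an Ising penalty under the usual 0/1 to ±1 change of variables); the specific constraint is chosen only for illustration.

```python
# Enumeration verifies that every feasible assignment sits at penalty 0 while
# every infeasible assignment is separated from it by a gap of at least 1.
import itertools

def penalty(x, y, z):
    """QUBO penalty for the constraint z = x AND y."""
    return x * y - 2 * (x + y) * z + 3 * z

feasible, infeasible = [], []
for x, y, z in itertools.product((0, 1), repeat=3):
    (feasible if z == (x and y) else infeasible).append(penalty(x, y, z))

gap = min(infeasible) - max(feasible)
print("feasible penalties:", feasible)      # all zero
print("penalty gap:", gap)                  # 1
```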
  • Publication number: 20180365594
    Abstract: Generative learning by computational systems can be achieved by: forming a generative learning model comprising a constraint satisfaction problem (CSP) defined over Boolean-valued variables; describing the CSP in first-order logic which is ground to propositional satisfiability; translating the CSP to clausal form; and performing inference with at least one satisfiability (SAT) solver. A generative learning model can be formed, for example by performing perceptual recognition of a string comprising a plurality of characters, determining whether the string is syntactically valid according to a grammar, and determining whether the string is denotationally valid. Various types of processors and/or circuitry can implement such methods.
    Type: Application
    Filed: January 27, 2017
    Publication date: December 20, 2018
    Inventors: William G. Macready, Fabian Ariel Chudak, Zhengbing Bian
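A minimal sketch of posing a constraint in clausal (CNF) form and checking satisfiability, in the spirit of this entry's abstract, assuming brute-force enumeration stands in for a real SAT solver; the tiny constraint is invented for illustration.

```python
# CNF for "exactly one of a, b is true": (a OR b) AND (NOT a OR NOT b).
# Literals are 1-based variable indices; a negative literal means negation.
import itertools

clauses = [[1, 2], [-1, -2]]
n_vars = 2

def satisfies(assignment, clauses):
    """assignment[i] is the truth value of variable i + 1."""
    return all(
        any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
        for clause in clauses
    )

models = [a for a in itertools.product((False, True), repeat=n_vars)
          if satisfies(a, clauses)]
print("satisfying assignments:", models)    # (False, True) and (True, False)
```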
  • Publication number: 20180101784
    Abstract: A computational system can include digital circuitry and analog circuitry, for instance a digital processor and a quantum processor. The quantum processor can operate as a sample generator providing samples. Samples can be employed by the digital processor in implementing various machine learning techniques. For example, the computational system can perform unsupervised learning over an input space, for example via a discrete variational auto-encoder, and attempt to maximize the log-likelihood of an observed dataset. Maximizing the log-likelihood of the observed dataset can include generating a hierarchical approximating posterior. Unsupervised learning can include generating samples of a prior distribution using the quantum processor. Generating samples using the quantum processor can include forming chains of qubits and representing discrete variables by chains.
    Type: Application
    Filed: October 5, 2017
    Publication date: April 12, 2018
    Inventors: Jason Rolfe, William G. Macready, Zhengbing Bian, Fabian A. Chudak
  • Publication number: 20170351974
    Abstract: Systems, methods and aspects, and embodiments thereof relate to unsupervised or semi-supervised feature learning using a quantum processor. To achieve unsupervised or semi-supervised feature learning, the quantum processor is programmed to achieve Hierarchical Deep Learning (referred to as HDL) over one or more data sets. Systems and methods search for, parse, and detect maximally repeating patterns in one or more data sets or across data or data sets. Embodiments and aspects regard using sparse coding to detect maximally repeating patterns in or across data. Examples of sparse coding include L0 and L1 sparse coding. Some implementations may involve appending, incorporating or attaching labels to dictionary elements, or constituent elements of one or more dictionaries. There may be a logical association between a label and the element labeled such that the process of unsupervised or semi-supervised feature learning spans both the elements and the incorporated, attached or appended label.
    Type: Application
    Filed: July 3, 2017
    Publication date: December 7, 2017
    Inventors: Geordie Rose, Suzanne Gildert, William G. Macready, Dominic Christoph Walliman
  • Publication number: 20170255629
    Abstract: Computational systems implement problem solving using hybrid digital/quantum computing approaches. A problem may be represented as a problem graph which is larger and/or has higher connectivity than a working and/or hardware graph of a quantum processor. A quantum processor may be used to determine approximate solutions, which solutions are provided as initial states to one or more digital processors which may implement classical post-processing to generate improved solutions. Techniques for solving problems on extended, more-connected, and/or “virtual full yield” variations of the processor's actual working and/or hardware graphs are provided. A method of operation in a computational system comprising a quantum processor includes partitioning a problem graph into sub-problem graphs, and embedding a sub-problem graph onto the working graph of the quantum processor. The quantum processor and a non-quantum processor-based device generate partial samples.
    Type: Application
    Filed: March 2, 2017
    Publication date: September 7, 2017
    Inventors: Murray C. Thom, Aidan P. Roy, Fabian A. Chudak, Zhengbing Bian, William G. Macready, Robert B. Israel, Tomas J. Boothby, Sheir Yarkoni, Yanbo Xue, Dmytro Korenkevych