Patents by Inventor Christopher James Hazard

Christopher James Hazard has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200371512
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular data elements, features, cases, etc. in a computer-based reasoning model (e.g., as data elements, cases or features are being added, or as part of pruning existing features or cases). Conviction measures are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the model and/or whether there are anomalies in the model. A controllable system may then be controlled using the computer-based reasoning model.
    Type: Application
    Filed: August 13, 2020
    Publication date: November 26, 2020
    Inventors: Ravisutha Sakrepatna Srinivasamurthy, Christopher James Hazard, Michael Resnick, Ju Hyun Kim, Yamac Alican Isik
  • Patent number: 10845769
    Abstract: Techniques are provided for imputation in computer-based reasoning systems. The techniques include performing the following until there are no more cases in a computer-based reasoning model with missing fields for which imputation is desired: determining which cases have fields to impute (e.g., missing fields) in the computer-based reasoning model and determining conviction scores for the cases that have fields to impute. The techniques proceed by determining for which cases to impute data based on conviction scores. For each of the determined cases, data is imputed for the missing field and the case is modified with the imputed data. Control of a system is then caused using the updated computer-based reasoning model.
    Type: Grant
    Filed: October 24, 2019
    Date of Patent: November 24, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Michael Resnick
  • Patent number: 10817750
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular data elements, features, cases, etc. in a computer-based reasoning model (e.g., as data elements, cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model.
    Type: Grant
    Filed: April 5, 2019
    Date of Patent: October 27, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Patent number: 10816980
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular features, cases, etc. in a computer-based reasoning model (e.g., as cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model. Example controllable systems include self-driving cars, image labeling systems, manufacturing and assembly controls, federated systems, smart voice controls, automated control of experiments, energy transfer systems, and the like.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: October 27, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Patent number: 10816981
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular features, cases, etc. in a computer-based reasoning model (e.g., as cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model. Example controllable systems include self-driving cars, image labeling systems, manufacturing and assembly controls, federated systems, smart voice controls, automated control of experiments, energy transfer systems, and the like.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: October 27, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200234151
    Abstract: Techniques are provided for imputation in computer-based reasoning systems. The techniques include performing the following until there are no more cases in a computer-based reasoning model with missing fields for which imputation is desired: determining which cases have fields to impute (e.g., missing fields) in the computer-based reasoning model and determining conviction scores and/or imputation order information for the cases that have fields to impute. The techniques proceed by determining for which cases to impute data; for each of the determined cases, data is imputed for the missing field and the case is modified with the imputed data. Control of a system is then caused using the updated computer-based reasoning model.
    Type: Application
    Filed: January 27, 2020
    Publication date: July 23, 2020
    Inventors: Michael Resnick, Christopher James Hazard
  • Patent number: 10713570
    Abstract: Techniques are provided for determining labels associated with first and second candidate code and whether those labels are compatible. When the first candidate code and the second candidate code are compatible, third candidate code is determined based on those two. When the third candidate code meets exit criteria, it is provided as evolved code. Some embodiments also include causing execution of the evolved code.
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: July 14, 2020
    Assignee: Diveplane Corporation
    Inventor: Christopher James Hazard
  • Publication number: 20200193309
    Abstract: Techniques for synthetic data generation in computer-based reasoning systems are discussed and include receiving a request for generation of synthetic training data based on a set of training data cases. One or more focal training data cases are determined. For undetermined features (either all of them or those that are not subject to conditions), a value for the feature is determined based on the focal cases. In some embodiments, validity of the generated value may be checked based on feature information. In some embodiments, generated synthetic data may be checked against all or a portion of the training data to ensure that it is not overly similar.
    Type: Application
    Filed: December 13, 2019
    Publication date: June 18, 2020
    Inventors: Christopher James Hazard, Michael Resnick, Christopher Fusting
  • Publication number: 20200193223
    Abstract: Techniques for synthetic data generation in computer-based reasoning systems are discussed and include receiving a request for generation of synthetic training data based on a set of training data cases. One or more focal training data cases are determined. For undetermined features (either all of them or those that are not subject to conditions), a distribution for the feature among the training cases is determined, and a value for the feature is determined based on that distribution. In some embodiments, the distribution may be perturbed based on target surprisal. In some embodiments, generated synthetic data may be tested for fitness. Further, the generated synthetic data may be provided in response to a request, used to train a computer-based reasoning model, and/or used to cause control of a system.
    Type: Application
    Filed: December 13, 2018
    Publication date: June 18, 2020
    Inventors: Christopher James Hazard, Michael Resnick
  • Publication number: 20200151589
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: November 30, 2018
    Publication date: May 14, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200151598
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: November 30, 2018
    Publication date: May 14, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200151590
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: November 30, 2018
    Publication date: May 14, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200134484
    Abstract: The techniques herein include using an input context to determine a suggested action and/or cluster. Explanations may also be determined and returned along with the suggested action. The explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. The explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: October 22, 2019
    Publication date: April 30, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200125968
    Abstract: Techniques for detecting and correcting anomalies in computer-based reasoning systems are provided herein. The techniques can include obtaining current context data and determining a contextually-determined action based on the obtained context data and a reasoning model. The reasoning model may have been determined based on one or more sets of training data. The techniques may cause performance of the contextually-determined action and, potentially, receive an indication that performing the contextually-determined action in the current context resulted in an anomaly. The techniques include determining the portion of the reasoning model that caused the determination of the contextually-determined action based on the obtained context data, and causing removal of that portion to produce a corrected reasoning model.
    Type: Application
    Filed: February 20, 2018
    Publication date: April 23, 2020
    Inventor: Christopher James Hazard
  • Publication number: 20200089173
    Abstract: Techniques are provided for imputation in computer-based reasoning systems. The techniques include performing the following until there are no more cases in a computer-based reasoning model with missing fields for which imputation is desired: determining which cases have fields to impute (e.g., missing fields) in the computer-based reasoning model and determining conviction scores for the cases that have fields to impute. The techniques proceed by determining for which cases to impute data based on conviction scores. For each of the determined cases, data is imputed for the missing field and the case is modified with the imputed data. Control of a system is then caused using the updated computer-based reasoning model.
    Type: Application
    Filed: October 24, 2019
    Publication date: March 19, 2020
    Inventors: Christopher James Hazard, Michael Resnick
  • Patent number: 10546240
    Abstract: Techniques are provided for imputation in computer-based reasoning systems. The techniques include performing the following until there are no more cases in a computer-based reasoning model with missing fields for which imputation is desired: determining which cases have fields to impute (e.g., missing fields) in the computer-based reasoning model and determining conviction scores for the cases that have fields to impute. The techniques proceed by determining for which cases to impute data based on the conviction scores. For each of the determined cases, data is imputed for the missing field and the case is modified with the imputed data. Control of a system is then caused using the updated computer-based reasoning model.
    Type: Grant
    Filed: September 13, 2018
    Date of Patent: January 28, 2020
    Assignee: Diveplane Corporation
    Inventors: Michael Resnick, Christopher James Hazard
  • Patent number: 10528877
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby cases; (ii) action probabilities, (iii) excluding cases and distances, (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: January 7, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20190310592
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to include or select one or more aspects in a computer-based reasoning model and determining two probability density or mass functions (“PDMFs”): one for the data set including the one or more particular aspects and one for the data set excluding them. Surprisal is determined based on those two PDMFs, and inclusion or selection in the computer-based reasoning model is determined based on surprisal. A system is later controlled using the computer-based reasoning model.
    Type: Application
    Filed: April 9, 2018
    Publication date: October 10, 2019
    Inventor: Christopher James Hazard
  • Publication number: 20190310591
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to include one or more particular data elements in a computer-based reasoning model and determining two probability density or mass functions (“PDMFs”): one for the data set including the one or more particular data elements and one for the data set excluding them. Surprisal is determined based on those two PDMFs, and inclusion in the computer-based reasoning model is determined based on surprisal. A system is later controlled using the computer-based reasoning model.
    Type: Application
    Filed: April 9, 2018
    Publication date: October 10, 2019
    Inventor: Christopher James Hazard
  • Publication number: 20190311220
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular data elements, features, cases, etc. in a computer-based reasoning model (e.g., as data elements, cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model.
    Type: Application
    Filed: April 5, 2019
    Publication date: October 10, 2019
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
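The conviction-based inclusion/exclusion described in patents 10816980, 10816981, and 10817750 (and publications 20190311220 and 20200371512) decides whether a case belongs in the model by comparing a conviction measure against a condition. A minimal sketch, using an assumed toy proxy (nearest-neighbor distance as surprisal, and conviction as the ratio of mean surprisal to a case's own surprisal) rather than the patented formulation:

```python
import numpy as np

def case_conviction(cases):
    """Toy conviction: ratio of average surprisal to each case's surprisal.

    Surprisal is approximated by the distance to a case's nearest neighbor
    (cases in sparser regions are treated as more surprising).
    """
    cases = np.asarray(cases, dtype=float)
    d = np.linalg.norm(cases[:, None, :] - cases[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # ignore self-distance
    surprisal = d.min(axis=1)             # nearest-neighbor distance per case
    return surprisal.mean() / surprisal   # high conviction = unsurprising

def prune(cases, threshold=0.5):
    """Exclude cases whose conviction falls below the threshold."""
    conv = case_conviction(cases)
    return [c for c, v in zip(cases, conv) if v >= threshold]
```

Under this proxy, an outlier far from the rest of the data has high surprisal, hence low conviction, and is pruned first.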
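The imputation technique of patent 10845769 (and the related 10546240, 20200089173, and 20200234151) fills missing fields case by case until none remain. A minimal sketch, assuming plain nearest-neighbor imputation in place of the patented conviction-driven ordering; the function name, the NaN convention, and the `k` parameter are illustrative:

```python
import numpy as np

def impute_missing(cases, k=3):
    """Fill NaN fields from the k nearest complete cases.

    Distance is computed on each incomplete case's observed fields only,
    and each missing field is filled with the neighbors' mean for that field.
    """
    cases = np.asarray(cases, dtype=float).copy()
    for i in range(len(cases)):
        obs = ~np.isnan(cases[i])
        if obs.all():
            continue  # nothing to impute for this case
        complete = [j for j in range(len(cases))
                    if j != i and not np.isnan(cases[j]).any()]
        # rank complete cases by distance on the observed fields
        dists = [np.linalg.norm(cases[j, obs] - cases[i, obs]) for j in complete]
        nearest = [complete[j] for j in np.argsort(dists)[:k]]
        for f in np.where(~obs)[0]:
            cases[i, f] = np.mean(cases[nearest, f])  # impute from neighbors
    return cases
```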
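The code-evolution technique of patent 10713570 combines two compatible candidates into a third and emits it once exit criteria are met. A hedged sketch in which candidates are token lists, "compatible" simply means equal labels, and the combination is single-point crossover; all of these representations are illustrative stand-ins, not the patented method:

```python
import random

def combine_candidates(code_a, code_b, label_a, label_b, meets_exit, rng=None):
    """If the two candidates' labels are compatible (here: equal), produce a
    third candidate by single-point crossover of their token lists; return it
    as 'evolved code' only when it meets the exit criteria predicate.
    """
    rng = rng or random.Random(0)
    if label_a != label_b:
        return None                             # incompatible candidates
    cut = rng.randrange(1, min(len(code_a), len(code_b)))
    candidate = code_a[:cut] + code_b[cut:]     # third candidate code
    return candidate if meets_exit(candidate) else None
```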
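The synthetic-data generation of publications 20200193223 and 20200193309 draws feature values from distributions over the training cases. A simplified sketch that samples each feature independently from its empirical distribution and adds a small Gaussian perturbation; conditioning on focal cases and target surprisal, as the filings describe, is omitted, and the `noise` parameter is an assumption:

```python
import random

def synthesize(training_cases, n, noise=0.05, seed=0):
    """Generate n synthetic cases: for each feature, draw a value from that
    feature's empirical distribution over the training cases, then perturb
    it with Gaussian noise so synthetic cases are not exact copies.
    """
    rng = random.Random(seed)
    n_features = len(training_cases[0])
    columns = [[case[f] for case in training_cases] for f in range(n_features)]
    return [[rng.choice(col) + rng.gauss(0, noise) for col in columns]
            for _ in range(n)]
```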
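The explanation techniques of patent 10528877 and publications 20200151589/20200151590/20200151598/20200134484 return a suggested action together with explanation data. A sketch covering only items (i) and (ii) of the abstracts, the most similar cases with distances and the action probabilities among them, assuming a simple k-nearest-neighbor model and majority vote:

```python
import numpy as np

def suggest_with_explanation(cases, actions, context, k=3):
    """k-NN action suggestion plus a minimal explanation payload: the k most
    similar cases (index, distance) and the empirical action probabilities
    among those neighbors.
    """
    cases = np.asarray(cases, dtype=float)
    dists = np.linalg.norm(cases - np.asarray(context, dtype=float), axis=1)
    order = np.argsort(dists)[:k]
    votes = [actions[i] for i in order]
    suggestion = max(set(votes), key=votes.count)   # majority action
    explanation = {
        "similar_cases": [(int(i), float(dists[i])) for i in order],
        "action_probabilities": {a: votes.count(a) / k for a in set(votes)},
    }
    return suggestion, explanation
```

Downstream logic could inspect `explanation` (e.g., low action probability or distant neighbors) to decide whether to actually perform the suggested action.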
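The anomaly-correction technique of publication 20200125968 removes the portion of the reasoning model that produced an anomalous action. For a case-based model, a hedged sketch is to drop the cases near the anomalous context; the distance metric and `radius` cutoff are illustrative assumptions:

```python
def correct_model(model_cases, bad_context, radius=1.0):
    """Produce a corrected model by removing all cases within `radius` of the
    context that led to the anomalous contextually-determined action.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [c for c in model_cases if dist(c, bad_context) > radius]
```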
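The surprisal-based inclusion of publications 20190310591 and 20190310592 compares two PDMFs, one with and one without the candidate data element. For discrete mass functions over matching outcomes, one natural stand-in for that surprisal is the KL divergence between the two distributions; this choice, and the `threshold`, are assumptions of the sketch, not the patented definition:

```python
import math

def surprisal_of_inclusion(p_with, p_without):
    """KL divergence D(p_with || p_without) between two discrete mass
    functions over the same outcomes: how much the model's distribution
    shifts when the candidate element is included.
    """
    return sum(pw * math.log(pw / po)
               for pw, po in zip(p_with, p_without) if pw > 0)

def include_element(p_with, p_without, threshold=0.01):
    """Include the element only if it shifts the model more than threshold."""
    return surprisal_of_inclusion(p_with, p_without) > threshold
```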