Patents by Inventor Christopher Fusting

Christopher Fusting has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200151590
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: November 30, 2018
    Publication date: May 14, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20200134484
    Abstract: The techniques herein include using an input context to determine a suggested action and/or cluster. Explanations may also be determined and returned along with the suggested action. The explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. The explanation data may be used to determine whether to perform a suggested action.
    Type: Application
    Filed: October 22, 2019
    Publication date: April 30, 2020
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Patent number: 10528877
    Abstract: The techniques herein include using an input context to determine a suggested action. One or more explanations may also be determined and returned along with the suggested action. The one or more explanations may include (i) one or more most similar cases to the suggested case (e.g., the case associated with the suggested action) and, optionally, a conviction score for each nearby case; (ii) action probabilities; (iii) excluding cases and distances; (iv) archetype and/or counterfactual cases for the suggested action; (v) feature residuals; (vi) regional model complexity; (vii) fractional dimensionality; (viii) prediction conviction; (ix) feature prediction contribution; and/or other measures such as the ones discussed herein, including certainty. In some embodiments, the explanation data may be used to determine whether to perform a suggested action.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: January 7, 2020
    Assignee: Diveplane Corporation
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20190311220
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular data elements, features, cases, etc. in a computer-based reasoning model (e.g., as data elements, cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model.
    Type: Application
    Filed: April 5, 2019
    Publication date: October 10, 2019
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20190310635
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular features, cases, etc. in a computer-based reasoning model (e.g., as cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model. Example controllable systems include self-driving cars, image labeling systems, manufacturing and assembly controls, federated systems, smart voice controls, automated control of experiments, energy transfer systems, and the like.
    Type: Application
    Filed: December 14, 2018
    Publication date: October 10, 2019
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
  • Publication number: 20190310634
    Abstract: Techniques are provided herein for creating well-balanced computer-based reasoning systems and using those to control systems. The techniques include receiving a request to determine whether to use one or more particular features, cases, etc. in a computer-based reasoning model (e.g., as cases or features are being added, or as part of pruning existing features or cases). Conviction measures (such as targeted or untargeted conviction, contribution, surprisal, etc.) are determined and inclusivity conditions are tested. The result of comparing the conviction measure can be used to determine whether to include or exclude the feature, case, etc. in the computer-based reasoning model. A controllable system may then be controlled using the computer-based reasoning model. Example controllable systems include self-driving cars, image labeling systems, manufacturing and assembly controls, federated systems, smart voice controls, automated control of experiments, energy transfer systems, and the like.
    Type: Application
    Filed: December 14, 2018
    Publication date: October 10, 2019
    Inventors: Christopher James Hazard, Christopher Fusting, Michael Resnick
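
The first three abstracts above describe case-based reasoning techniques that pair a suggested action with explanation data such as the most similar cases, their distances, and action probabilities. The following is a minimal, hypothetical sketch of that general pattern; the names (Case, suggest_action), the plain Euclidean distance, and the k-nearest-neighbor selection are illustrative assumptions and are not taken from the patents:

    # Hypothetical sketch: suggest an action from an input context and
    # return simple explanation data (similar cases, distances, and
    # action probabilities). Illustrative only; not the patented method.
    import math
    from collections import Counter
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Case:
        context: List[float]   # feature values describing the situation
        action: str            # the action taken in that situation

    def _distance(a: List[float], b: List[float]) -> float:
        # Plain Euclidean distance; the abstracts mention richer measures.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def suggest_action(cases: List[Case], context: List[float], k: int = 3):
        """Return a suggested action plus simple explanation data."""
        ranked = sorted(cases, key=lambda c: _distance(c.context, context))
        nearest = ranked[:k]

        # Suggested action: the action of the single most similar case.
        suggested = nearest[0].action

        # Explanation data: the k most similar cases with their distances,
        # and rough action probabilities among those neighbors.
        neighbors: List[Tuple[Case, float]] = [
            (c, _distance(c.context, context)) for c in nearest
        ]
        counts = Counter(c.action for c in nearest)
        action_probabilities = {a: n / len(nearest) for a, n in counts.items()}

        return suggested, {"similar_cases": neighbors,
                           "action_probabilities": action_probabilities}

    if __name__ == "__main__":
        history = [
            Case([0.0, 1.0], "brake"),
            Case([0.1, 0.9], "brake"),
            Case([1.0, 0.0], "accelerate"),
        ]
        action, explanation = suggest_action(history, [0.05, 0.95])
        print(action, explanation["action_probabilities"])

The explanation dictionary returned here stands in for the richer measures listed in the abstracts (feature residuals, regional model complexity, prediction conviction, and so on), which are not reproduced in this sketch.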
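The remaining abstracts describe deciding whether to include or exclude individual cases or features in a computer-based reasoning model by determining conviction measures (targeted or untargeted conviction, contribution, surprisal, etc.) and testing inclusivity conditions. Below is a rough, hypothetical illustration of that idea; the nearest-neighbor surprisal proxy, the conviction ratio, and the 0.5 threshold are assumptions made for the sketch, not the definitions used in the patents:

    # Hypothetical sketch: a conviction-style measure for a candidate case
    # and an inclusion test against a threshold. The surprisal proxy and
    # threshold are illustrative assumptions, not the patented definitions.
    import math
    from typing import List

    def nearest_distance(point: List[float], others: List[List[float]]) -> float:
        # Distance from `point` to its nearest neighbor among `others`.
        return min(
            math.sqrt(sum((x - y) ** 2 for x, y in zip(point, o)))
            for o in others
        )

    def conviction(candidate: List[float], model: List[List[float]]) -> float:
        """Ratio of the model's average surprisal proxy to the candidate's.

        Each case's "surprisal" is crudely proxied by its distance to the
        nearest other case: a candidate far from all existing cases is
        surprising (low conviction); one in a dense region is not.
        """
        per_case = [
            nearest_distance(c, [m for m in model if m is not c]) for c in model
        ]
        expected = sum(per_case) / len(per_case)
        candidate_surprisal = nearest_distance(candidate, model)
        return expected / max(candidate_surprisal, 1e-12)

    def should_include(candidate, model, threshold: float = 0.5) -> bool:
        # Inclusion condition: keep the case only if its conviction clears
        # the (arbitrary, illustrative) threshold.
        return conviction(candidate, model) >= threshold

    if __name__ == "__main__":
        model = [[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [1.1, 0.9]]
        print(should_include([0.05, 0.05], model))   # dense region -> True
        print(should_include([10.0, -10.0], model))  # distant outlier -> False

Once such inclusion decisions prune or grow the case base, the abstracts note that the resulting model can be used to control systems such as self-driving cars, image labeling systems, or energy transfer systems.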