Patents by Inventor Bobbie Danielle Seppelt

Bobbie Danielle Seppelt has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11688203
    Abstract: Systems and methods for managing visual allocation are provided herein that use models to determine states based on visual data and, based thereon, output feedback corresponding to the determined states. Visual data is initially obtained by a visual allocation management system. The visual data includes eye image sequences of a person in a particular state, such as engaging in a task or activity. Visual features can be identified from the visual data, such that glance information including direction and duration can be calculated. The visual data, information derived therefrom, and/or other contextual data is input into the models, which correspond to states, to calculate probabilities that the particular state the person is engaged in is one of the modeled states. Based on the state identified as having the highest probability, optimal feedback, such as a warning or instruction, can be output to connected devices, systems, or objects.
    Type: Grant
    Filed: December 11, 2020
    Date of Patent: June 27, 2023
    Assignee: Massachusetts Institute of Technology
    Inventors: Andres Mauricio Muñoz Delgado, Bryan L. Reimer, Joonbum Lee, Linda Sala Angell, Bobbie Danielle Seppelt, Bruce L. Mehler, Joseph F. Coughlin
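    A minimal, hypothetical sketch of the state-inference step described in the abstract above: duration-weighted glance observations are scored against simple per-state models, the highest-probability state is selected, and corresponding feedback is emitted. The state names, probability tables, and feedback strings are illustrative assumptions, not taken from the patent.

    ```python
    # Hypothetical per-state glance models and feedback messages (illustrative only).
    GLANCE_MODELS = {
        "attentive_driving": {"road": 0.80, "mirror": 0.15, "device": 0.05},
        "device_distraction": {"road": 0.30, "mirror": 0.10, "device": 0.60},
    }
    FEEDBACK = {
        "attentive_driving": None,  # no warning needed
        "device_distraction": "Warning: eyes off road - return attention to driving",
    }

    def infer_state(glances):
        """Score each modeled state against (region, duration) glance observations
        and return the most likely state, its probability, and any feedback."""
        scores = {}
        for state, model in GLANCE_MODELS.items():
            score = 1.0
            for region, duration in glances:
                # Duration-weighted likelihood of looking at this region in this state.
                score *= model.get(region, 0.01) ** duration
            scores[state] = score
        total = sum(scores.values()) or 1.0
        probs = {state: s / total for state, s in scores.items()}
        best = max(probs, key=probs.get)
        return best, probs[best], FEEDBACK[best]

    if __name__ == "__main__":
        # Observed glances as (region, duration-in-seconds) pairs.
        observed = [("road", 1.5), ("device", 2.0), ("road", 0.5)]
        state, prob, feedback = infer_state(observed)
        print(state, round(prob, 3), feedback)
    ```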
  • Publication number: 20210150390
    Abstract: Systems and methods for managing visual allocation are provided herein that use models to determine states based on visual data and, based thereon, output feedback corresponding to the determined states. Visual data is initially obtained by a visual allocation management system. The visual data includes eye image sequences of a person in a particular state, such as engaging in a task or activity. Visual features can be identified from the visual data, such that glance information including direction and duration can be calculated. The visual data, information derived therefrom, and/or other contextual data is input into the models, which correspond to states, to calculate probabilities that the particular state the person is engaged in is one of the modeled states. Based on the state identified as having the highest probability, optimal feedback, such as a warning or instruction, can be output to connected devices, systems, or objects.
    Type: Application
    Filed: December 11, 2020
    Publication date: May 20, 2021
    Inventors: Andres Mauricio Muñoz Delgado, Bryan L. Reimer, Joonbum Lee, Linda Sala Angell, Bobbie Danielle Seppelt, Bruce L. Mehler, Joseph F. Coughlin
  • Patent number: 10902331
    Abstract: Systems and methods for managing visual allocation are provided herein that use models to determine states based on visual data and, based thereon, output feedback corresponding to the determined states. Visual data is initially obtained by a visual allocation management system. The visual data includes eye image sequences of a person in a particular state, such as engaging in a task or activity. Visual features can be identified from the visual data, such that glance information including direction and duration can be calculated. The visual data, information derived therefrom, and/or other contextual data is input into the models, which correspond to states, to calculate probabilities that the particular state the person is engaged in is one of the modeled states. Based on the state identified as having the highest probability, optimal feedback, such as a warning or instruction, can be output to connected devices, systems, or objects.
    Type: Grant
    Filed: August 21, 2017
    Date of Patent: January 26, 2021
    Assignee: Massachusetts Institute of Technology
    Inventors: Andres Mauricio Muñoz Delgado, Bryan L. Reimer, Joonbum Lee, Linda Sala Angell, Bobbie Danielle Seppelt, Bruce L. Mehler, Joseph F. Coughlin
  • Patent number: 10525984
    Abstract: Systems and methods for assessing resource allocation are provided. In some exemplary embodiments, the system uses an attention buffer to classify glances by a person and/or automated system, with the buffer determining the impact those glances have on the person's and/or automated system's situation awareness level. The attention buffer continuously calculates a buffer value representative of that situation awareness level at a particular moment in time. The calculated buffer values, referred to as moment-to-moment buffer values, can be used as data points and/or to direct action by the system and/or person to alter the situation awareness level of the person and/or automated system.
    Type: Grant
    Filed: August 21, 2017
    Date of Patent: January 7, 2020
    Assignee: Massachusetts Institute of Technology
    Inventors: Bobbie Danielle Seppelt, Joonbum Lee, Linda Sala Angell, Bryan L. Reimer, Bruce L. Mehler, Joseph F. Coughlin
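    A minimal sketch of the moment-to-moment buffer computation described in the abstract above, assuming a simple decay/recovery rule: the buffer value falls during off-road glances, recovers during on-road glances, and a low value can be used to direct corrective action. The rates, bounds, and threshold are illustrative assumptions, not values from the patent.

    ```python
    # Illustrative constants; the patent does not prescribe these values.
    BUFFER_MAX = 2.0       # upper bound on stored situation awareness (assumed)
    BUFFER_MIN = 0.0
    DECAY_RATE = 1.0       # buffer units lost per second of off-road glance (assumed)
    RECOVERY_RATE = 0.5    # buffer units regained per second of on-road glance (assumed)

    def update_buffer(buffer, glance_region, dt):
        """Advance the buffer value by one time step given where the person is looking."""
        if glance_region == "road":
            buffer += RECOVERY_RATE * dt
        else:
            buffer -= DECAY_RATE * dt
        return max(BUFFER_MIN, min(BUFFER_MAX, buffer))

    if __name__ == "__main__":
        buffer = BUFFER_MAX
        # Stream of (glance_region, duration) samples, e.g. from an eye tracker.
        for region, dt in [("road", 1.0), ("device", 1.5), ("device", 1.0), ("road", 2.0)]:
            buffer = update_buffer(buffer, region, dt)
            print(f"{region:7s} -> buffer = {buffer:.2f}")
            if buffer <= 0.5:  # illustrative threshold for prompting corrective action
                print("  low situation awareness: prompt eyes back to the road")
    ```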
  • Publication number: 20180072327
    Abstract: Systems and methods for assessing resource allocation are provided. In some exemplary embodiments, the system uses an attention buffer to classify glances by a person and/or automated system, with the buffer determining the impact those glances have on the person's and/or automated system's situation awareness level. The attention buffer continuously calculates a buffer value representative of that situation awareness level at a particular moment in time. The calculated buffer values, referred to as moment-to-moment buffer values, can be used as data points and/or to direct action by the system and/or person to alter the situation awareness level of the person and/or automated system.
    Type: Application
    Filed: August 21, 2017
    Publication date: March 15, 2018
    Inventors: Bobbie Danielle Seppelt, Joonbum Lee, Linda Sala Angell, Bryan L. Reimer, Bruce L. Mehler, Joseph F. Coughlin
  • Publication number: 20180053103
    Abstract: Systems and methods for managing visual allocation are provided herein that use models to determine states based on visual data and, based thereon, output feedback corresponding to the determined states. Visual data is initially obtained by a visual allocation management system. The visual data includes eye image sequences of a person in a particular state, such as engaging in a task or activity. Visual features can be identified from the visual data, such that glance information including direction and duration can be calculated. The visual data, information derived therefrom, and/or other contextual data is input into the models, which correspond to states, to calculate probabilities that the particular state the person is engaged in is one of the modeled states. Based on the state identified as having the highest probability, optimal feedback, such as a warning or instruction, can be output to connected devices, systems, or objects.
    Type: Application
    Filed: August 21, 2017
    Publication date: February 22, 2018
    Inventors: Andres Mauricio Muñoz Delgado, Bryan L. Reimer, Joonbum Lee, Linda Sala Angell, Bobbie Danielle Seppelt, Bruce L. Mehler, Joseph F. Coughlin