Patents by Inventor Nelson George Publicover

Nelson George Publicover has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11947399
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at one location on the body of the device from the signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Grant
    Filed: March 4, 2023
    Date of Patent: April 2, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
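The tap-localization idea above (examining IMU signatures to classify where on the device body a tap occurred) can be sketched in a few lines. This is a hypothetical illustration, not the patented method: the feature set (dominant accelerometer axis and the sign of its peak) and the axis-to-location map are invented for the example, and a real system would use the neural-network or numerical classifiers the abstract mentions.

```python
def tap_features(accel_window):
    """Extract a simple signature from a window of (ax, ay, az) samples."""
    # Peak absolute acceleration per axis over the window
    peaks = [max(abs(s[i]) for s in accel_window) for i in range(3)]
    dominant_axis = peaks.index(max(peaks))
    # Sign of the dominant axis at its peak sample
    peak_sample = max(accel_window, key=lambda s: abs(s[dominant_axis]))
    sign = 1 if peak_sample[dominant_axis] > 0 else -1
    return dominant_axis, sign

def classify_tap(accel_window):
    """Map an (axis, sign) signature to a tap location on the device body."""
    locations = {
        (0, 1): "left side",  (0, -1): "right side",
        (1, 1): "bottom",     (1, -1): "top",
        (2, 1): "back",       (2, -1): "front",
    }
    return locations[tap_features(accel_window)]

# A strong +z spike dominates this window, so the rule maps it to "back"
window = [(0.1, 0.0, 0.2), (0.0, 0.3, 9.5), (0.2, 0.1, 1.1)]
print(classify_tap(window))  # back
```

In practice a learned classifier would replace the lookup table, since taps at nearby locations produce overlapping signatures that simple peak features cannot separate.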
  • Patent number: 11941185
    Abstract: Systems and methods are described in which user reactions to one or more interactions initiated by a handheld device are a basis for controlling the broadcasting of serial content on a content delivery device such as a television or tablet. Interactions initiated on the handheld device may include visual, acoustic and/or haptic components based on the one or more contexts of concurrent (i.e., current or recently broadcast) or ensuing (i.e., not yet broadcast) content. User reactions may be determined based on device movements or orientations (sensed by a camera or inertial measurement unit), verbal replies (sensed by a microphone), and/or pushbutton indications. Based on user reactions and/or predetermined preferences, serial content may be replayed, paused, bypassed, played in slow-motion, or played in a so-called “fast forward” mode. Systems and methods may provide simple and intuitive methods to control serial content delivery that may be particularly well-suited for young children.
    Type: Grant
    Filed: December 29, 2022
    Date of Patent: March 26, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff
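The reaction-driven playback control described above (replay, pause, bypass, slow-motion, or fast-forward based on classified user reactions and preferences) can be sketched as a simple decision table. All reaction labels, action names, and the preference key below are hypothetical; the patent does not specify this mapping.

```python
def playback_action(reaction, preferences=None):
    """Choose a serial-content playback control from a classified user
    reaction, optionally modulated by predetermined preferences (sketch)."""
    preferences = preferences or {}
    actions = {
        "confused": "replay",        # re-show the segment the user missed
        "distracted": "pause",       # wait until attention returns
        "bored": "bypass",           # skip ahead
        "delighted": "slow_motion" if preferences.get("savor") else "play",
    }
    # Default: keep playing normally for unrecognized reactions
    return actions.get(reaction, "play")

print(playback_action("confused"))                    # replay
print(playback_action("delighted", {"savor": True}))  # slow_motion
```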
  • Publication number: 20240038084
    Abstract: Systems and methods are described in which one or more attributes or cues related to a discoverable object are conveyed to a user of a handheld device via audible, haptic and/or visual means. Using a light beam generated by the handheld device and pointed in the same direction as a device camera, the user may point toward one or more objects in the user's environment believed to be associated with the one or more attributes or cues. Based on the region being pointed at within camera images, the handheld device may classify objects to determine whether an object matches one or more templates or classifications of the discoverable object. A match, or mismatch, of objects being pointed at may determine subsequent actions. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for learners.
    Type: Application
    Filed: May 23, 2023
    Publication date: February 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
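The match/mismatch step above (classifying the object in the pointed-at image region and comparing it against the discoverable target) reduces to a region lookup plus a label comparison. A minimal sketch, assuming an upstream detector has already produced labeled bounding boxes; the function names and box format are invented for illustration.

```python
def object_at(image_labels, point):
    """Return the label of the detected region containing the pointed-at
    image location. image_labels: list of (label, (x0, y0, x1, y1)) boxes."""
    x, y = point
    for label, (x0, y0, x1, y1) in image_labels:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None  # nothing detected where the light beam points

def check_discovery(image_labels, point, target):
    """Compare the pointed-at object against the discoverable target to
    choose a subsequent action (e.g., praise vs. another cue)."""
    return "match" if object_at(image_labels, point) == target else "mismatch"

labels = [("ball", (0, 0, 10, 10)), ("book", (20, 0, 30, 10))]
print(check_discovery(labels, (25, 5), "book"))  # match
```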
  • Publication number: 20240036619
    Abstract: Systems and methods are described in which a “one-of-N” selection may be produced based on one or more associations made by a human between a selected object from a group of “N” objects, and visual or acoustic cues produced by a handheld device. Objects may be comprised of real or virtual (i.e., displayed) objects in the environment of the handheld device user. Visual and/or acoustic cues may be generated by the handheld device under the control of the device user. Upon determining a perceived association between a cue and a selected object, the user may indicate a selection using one or more handheld device sensing elements such as a pushbutton, touch sensor, or inertial force measurement. Systems and methods may provide simple and intuitive methods to produce selected actions on the handheld device or on a connected device.
    Type: Application
    Filed: September 13, 2022
    Publication date: February 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
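The "one-of-N" selection above can be sketched as a loop that cycles a cue across the N objects until the user signals an association (e.g., a pushbutton press). The callback interface below is invented for the example; the patent describes the general mechanism, not this API.

```python
import itertools

def one_of_n_selection(objects, confirm):
    """Cycle a visual/acoustic cue across N objects; confirm(index) stands
    in for a pushbutton, touch, or inertial indication by the user that the
    currently cued object is the intended selection."""
    for i in itertools.cycle(range(len(objects))):
        # Present the cue for objects[i] here (display flash, tone, etc.)
        if confirm(i):
            return objects[i]

objects = ["cat", "dog", "bird"]
picked = one_of_n_selection(objects, confirm=lambda i: objects[i] == "dog")
print(picked)  # dog
```

A real device would pace the cue cycle to human reaction time and debounce the confirmation input; the loop structure stays the same.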
  • Publication number: 20240036620
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at one location on the body of the device from the signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Application
    Filed: March 4, 2023
    Publication date: February 1, 2024
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
  • Patent number: 11652654
    Abstract: Systems and methods are described in which the control of virtual activities is cooperatively shared among two or more interactive users. Within selected virtual actions, an initiation component is identified that, when enacted, automatically informs one or more shared-experience users of the intent by the initiator to complete the virtual action and triggers monitoring of the shared-experience users for indications signaling agreement to proceed. When the degree of agreement meets a predetermined threshold or is judged by the initiator to be adequate, the completion component of the virtual actions is enacted on all devices. Automatic incorporation of initiation and completion components during shared activities may allow for cooperative control of virtual activities without repeated and potentially monotonous questions and responses regarding when to proceed.
    Type: Grant
    Filed: November 22, 2021
    Date of Patent: May 16, 2023
    Assignee: KINOO, INC.
    Inventors: Lewis James Marggraff, Nelson George Publicover
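The agreement-threshold step above (enact the completion component once enough shared-experience users signal agreement) is a simple ratio test. A minimal sketch, assuming agreement indications have already been collected as booleans per user; the threshold value is illustrative.

```python
def enact_if_agreed(responses, threshold=0.5):
    """Return True (enact the completion component on all devices) once the
    fraction of agreeing shared-experience users meets the threshold.
    responses: mapping of user id -> True/False agreement indication."""
    if not responses:
        return False  # no monitored users, nothing to agree
    agreement = sum(responses.values()) / len(responses)
    return agreement >= threshold

print(enact_if_agreed({"ann": True, "bob": True, "cho": False}))  # True
```

The abstract also allows the initiator to judge agreement adequate directly; that path would bypass the ratio test.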
  • Publication number: 20230123893
    Abstract: Systems and methods are described in which the control of virtual activities is cooperatively shared among two or more interactive users. Within selected virtual actions, an initiation component is identified that, when enacted, automatically informs one or more shared-experience users of the intent by the initiator to complete the virtual action and triggers monitoring of the shared-experience users for indications signaling agreement to proceed. When the degree of agreement meets a predetermined threshold or is judged by the initiator to be adequate, the completion component of the virtual actions is enacted on all devices. Automatic incorporation of initiation and completion components during shared activities may allow for cooperative control of virtual activities without repeated and potentially monotonous questions and responses regarding when to proceed.
    Type: Application
    Filed: November 22, 2021
    Publication date: April 20, 2023
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Patent number: 11614781
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at a location on the body of the device compared with signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Grant
    Filed: July 26, 2022
    Date of Patent: March 28, 2023
    Assignee: KINOO, INC.
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
  • Patent number: 11556755
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but classified by natural language and/or computer vision processing.
    Type: Grant
    Filed: June 3, 2022
    Date of Patent: January 17, 2023
    Assignee: KINOO, INC.
    Inventors: Nelson George Publicover, Lewis James Marggraff
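The interrupt-and-intersperse logic above (pause delivery and insert a contextual virtual agent snippet keyed to content context and user interests) can be sketched as a small decision function. The reaction labels, the "recap" fallback, and the return shape are all invented for this illustration.

```python
def maybe_interrupt(context, reactions, interests):
    """Decide whether to pause content delivery and intersperse a snippet
    from a contextual virtual agent (CVA), choosing a snippet topic tied to
    the users' identified interests (illustrative logic only)."""
    if any(r in ("confused", "disengaged") for r in reactions):
        # Tie the snippet to the current context when it matches an interest
        topic = context if context in interests else "recap"
        return {"interrupt": True, "snippet_topic": topic}
    return {"interrupt": False}

print(maybe_interrupt("dinosaurs", ["confused"], {"dinosaurs"}))
```

The abstract's "future content" awareness would extend this by letting `context` include classifications of not-yet-delivered segments.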
  • Publication number: 20220309948
    Abstract: Systems and methods are described to enact machine-based, simultaneous classification of emotional and cognitive states of an individual in substantially real time. A bootstrap approach is used to identify indicators of emotional state, including boredom, engagement and feeling overwhelmed, by finding additional interactions temporally adjacent to known indicators. A similar bootstrap approach is used to identify new indicators of knowledge acquisition (or confusion), by locating frequent temporally adjacent interactions. Systems and methods may enhance teaching by identifying knowledge presentations that result in successful knowledge acquisition, and learning by allowing instructors associated with an individual to track materials that have been understood (or misunderstood). Machine-based tracking of learning may circumvent boredom by avoiding needless repetition, and facilitate teaching of synergistic topics related to materials recently understood.
    Type: Application
    Filed: June 13, 2022
    Publication date: September 29, 2022
    Inventors: Nelson George Publicover, Lewis James Marggraff
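The bootstrap approach above (finding new state indicators among interactions that frequently occur temporally adjacent to known indicators) can be sketched over a timestamped event stream. The window size, count threshold, and event labels below are illustrative assumptions, not values from the publication.

```python
from collections import Counter

def bootstrap_indicators(events, known, window=2.0, min_count=2):
    """Find candidate new indicators: interaction types that frequently occur
    within `window` seconds of a known indicator.
    events: list of (timestamp, interaction_type) pairs."""
    known_times = [t for t, kind in events if kind in known]
    counts = Counter(
        kind for t, kind in events
        if kind not in known and any(abs(t - kt) <= window for kt in known_times)
    )
    # Keep only interaction types seen near known indicators often enough
    return {kind for kind, n in counts.items() if n >= min_count}

events = [(0.0, "sigh"), (0.5, "look_away"), (1.0, "sigh"),
          (1.5, "look_away"), (9.0, "click")]
print(bootstrap_indicators(events, known={"sigh"}))  # {'look_away'}
```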
  • Publication number: 20220292327
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but classified by natural language and/or computer vision processing.
    Type: Application
    Filed: June 3, 2022
    Publication date: September 15, 2022
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 11409359
    Abstract: Systems and methods are described to enact machine-based, collective control by two individuals of one or more displayed virtual objects. Collective interactions may be implemented by combining an ability to specify one or more locations on a touch-sensitive display using one or more digits of a first user with an ability to monitor a portable, handheld controller manipulated by the other user. Alternatively or in addition, pointing by the first user to the one or more locations on a display may be enhanced by a stylus or other pointing device. The handheld controller may be tracked within camera-acquired images by following camera-trackable controller components and/or by acquiring measurements from one or more embedded inertial measurement units (IMUs). Optionally, one or more switches or sensors may be included within the handheld controller, operable by one or more digits of the second user's hand to enable alternative virtual object display and/or menu selections during collective interactions.
    Type: Grant
    Filed: November 19, 2021
    Date of Patent: August 9, 2022
    Assignee: KINOO, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Patent number: 11393357
    Abstract: Systems and methods are described to enact machine-based, simultaneous classification of emotional and cognitive states of an individual in substantially real time. A bootstrap approach is used to identify indicators of emotional state, including boredom, engagement and feeling overwhelmed, by finding additional interactions temporally adjacent to known indicators. A similar bootstrap approach is used to identify new indicators of knowledge acquisition (or confusion), by locating frequent temporally adjacent interactions. Systems and methods may enhance teaching by identifying knowledge presentations that result in successful knowledge acquisition, and learning by allowing instructors associated with an individual to track materials that have been understood (or misunderstood). Machine-based tracking of learning may circumvent boredom by avoiding needless repetition, and facilitate teaching of synergistic topics related to materials recently understood.
    Type: Grant
    Filed: March 12, 2021
    Date of Patent: July 19, 2022
    Assignee: Kinoo, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 11366997
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but classified by natural language and/or computer vision processing.
    Type: Grant
    Filed: April 17, 2021
    Date of Patent: June 21, 2022
    Assignee: KINOO, INC.
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Patent number: 11334178
    Abstract: Systems and methods are described to enact machine-based, substantially simultaneous, two-handed interactions with one or more displayed virtual objects. Bimanual interactions may be implemented by combining an ability to specify one or more locations on a touch-sensitive display using one or more digits of a first hand with an ability to monitor a portable, handheld controller manipulated by the other hand. Alternatively or in addition, pointing by the first hand to the one or more locations on a display may be enhanced by a stylus or other pointing device. The handheld controller may be tracked within camera-acquired images by following camera-trackable controller components and/or by acquiring measurements from one or more embedded inertial measurement units (IMUs). Optionally, one or more switches or sensors may be included within the handheld controller, operable by one or more digits of the second hand to enable alternative virtual object display and/or menu selections during bimanual interactions.
    Type: Grant
    Filed: August 6, 2021
    Date of Patent: May 17, 2022
    Assignee: KINOO, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover
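The bimanual combination above (a touch location from one hand plus controller pose and switch state from the other) can be sketched as a per-frame fusion of the two input streams. The field names and the move-versus-menu rule are invented for the example.

```python
def bimanual_update(touch_point, controller_pose, button_pressed):
    """Fuse a first-hand touch location with second-hand controller state
    into one virtual-object manipulation command (illustrative sketch).
    controller_pose: (roll, pitch, yaw) from camera tracking and/or an IMU."""
    x, y = touch_point
    # A controller switch toggles between moving the object and menu selection
    action = "menu_select" if button_pressed else "move"
    return {"target": (x, y), "orientation": controller_pose, "action": action}

print(bimanual_update((120, 340), (0.0, 0.5, 0.0), False)["action"])  # move
```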
  • Publication number: 20210390364
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but classified by natural language and/or computer vision processing.
    Type: Application
    Filed: April 17, 2021
    Publication date: December 16, 2021
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Publication number: 20210390876
    Abstract: Systems and methods are described to enact machine-based, simultaneous classification of emotional and cognitive states of an individual in substantially real time. A bootstrap approach is used to identify indicators of emotional state, including boredom, engagement and feeling overwhelmed, by finding additional interactions temporally adjacent to known indicators. A similar bootstrap approach is used to identify new indicators of knowledge acquisition (or confusion), by locating frequent temporally adjacent interactions. Systems and methods may enhance teaching by identifying knowledge presentations that result in successful knowledge acquisition, and learning by allowing instructors associated with an individual to track materials that have been understood (or misunderstood). Machine-based tracking of learning may circumvent boredom by avoiding needless repetition, and facilitate teaching of synergistic topics related to materials recently understood.
    Type: Application
    Filed: March 12, 2021
    Publication date: December 16, 2021
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 11127131
    Abstract: Systems and methods are described for measuring, using one or more cameras, anthropomorphic features of two or more individuals to classify and/or grade whether the two or more individuals possess the physical abilities to collectively perform desired physical actions or activities. The performance of such actions may involve objects with measured or known dimensions, and/or one or more assistive devices to compensate for differences in anthropomorphic features. Anthropomorphic features of the two or more individuals may also be projected into the future based on attributes such as ages, medical conditions, activity levels and predispositions. Classification schemes may use neural network-based approaches and/or statistical methods based on labelled datasets of abilities of two or more individuals with given anthropomorphic features to perform selected actions.
    Type: Grant
    Filed: February 22, 2021
    Date of Patent: September 21, 2021
    Inventors: Marc Michael Thomas, Alexander Paul Barangan, Nelson George Publicover
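The grading idea above (comparing combined anthropomorphic measurements against an action's requirements, with assistive devices compensating for shortfalls) can be sketched as a rule. The arm-span metric, the extender length, and the grade labels are illustrative placeholders for the neural-network or statistical classifiers the abstract describes.

```python
def grade_collective_ability(arm_spans_cm, required_span_cm, extender_cm=30):
    """Grade whether two (or more) individuals can jointly span a required
    distance, optionally compensating with a hypothetical grip extender."""
    combined = sum(arm_spans_cm)
    if combined >= required_span_cm:
        return "able"
    if combined + extender_cm >= required_span_cm:
        return "able with assistive device"
    return "unable"

print(grade_collective_ability([160, 150], 330))  # able with assistive device
```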
  • Patent number: 10963816
    Abstract: Systems and methods are described for time-shifting interactions by sharing an artificial intelligence personality (AIP). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within one or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality. HIEs may continue to operate without interruption in the presence of telecommunications delays or interruptions, and/or the absence of one or more human participants; allowing participants to “time-shift” their sense of connectedness.
    Type: Grant
    Filed: October 27, 2020
    Date of Patent: March 30, 2021
    Assignee: KINOO, INC.
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 10915814
    Abstract: Systems and methods are described for time-sharing interactions using a shared artificial intelligence personality (AIP) incorporated within multiple human interaction entities (HIEs). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within two or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide two or more users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality.
    Type: Grant
    Filed: June 15, 2020
    Date of Patent: February 9, 2021
    Assignee: Kinoo, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff, Mary Jo Marggraff
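The shared-AIP mechanism in the two entries above (one cumulative personality instantiated across multiple HIEs, periodically updated from interactions any entity senses) can be sketched as a queue-and-merge loop. The class and method names are invented for this illustration; a real system would also synchronize state across devices and tolerate telecommunication delays by letting each HIE keep operating on its last update.

```python
class SharedAIP:
    """Sketch of an artificial intelligence personality (AIP) shared by
    several human interaction entities (HIEs): interactions sensed by any
    HIE are queued, then folded into one cumulative personality state."""

    def __init__(self):
        self.knowledge = []   # cumulative personality/knowledge base
        self.pending = []     # interactions sensed since the last update

    def sense(self, hie_id, interaction):
        """Record an interaction observed by one HIE (robot, toy, GUI...)."""
        self.pending.append((hie_id, interaction))

    def periodic_update(self):
        """Fold queued interactions into the cumulative personality and
        return the state every HIE should now share."""
        self.knowledge.extend(interaction for _, interaction in self.pending)
        self.pending.clear()
        return list(self.knowledge)

aip = SharedAIP()
aip.sense("robot_a", "greeted_child")
aip.sense("tablet_b", "read_story")
print(aip.periodic_update())  # ['greeted_child', 'read_story']
```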