Patents by Inventor James Marggraff

James Marggraff has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12282613
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a selected object or location within a printed page containing visual content, for example displayed in a book or magazine. The object(s) and/or location being pointed at may be determined using a device camera pointed in the same direction as the beam coupled with computer vision methods to determine locations of camera-acquired images within a database of page-content templates. The database may include interactive sequences and/or scenarios associated with page objects and locations. Such interactive sequences may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for children and other learners.
    Type: Grant
    Filed: March 6, 2024
    Date of Patent: April 22, 2025
    Assignee: Kibeam Learning, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover
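    The recognition step this abstract describes, locating a camera-acquired image within a database of page-content templates, can be illustrated with a short feature-matching sketch. It assumes OpenCV ORB descriptors; the patent does not name an algorithm, and every function name and threshold below is illustrative:

        # Sketch: identify which page template a camera frame shows, using
        # ORB features and brute-force descriptor matching (assumed choices;
        # the patent does not specify OpenCV or ORB).
        import cv2

        orb = cv2.ORB_create(nfeatures=500)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

        def build_template_db(template_images):
            """Precompute ORB descriptors for each (page_id, image) pair."""
            db = []
            for page_id, img in template_images:
                _, descriptors = orb.detectAndCompute(img, None)
                if descriptors is not None:
                    db.append((page_id, descriptors))
            return db

        def identify_page(frame, db, min_matches=25):
            """Return the page_id whose template best matches the frame."""
            _, frame_desc = orb.detectAndCompute(frame, None)
            if frame_desc is None:
                return None
            best_id, best_count = None, 0
            for page_id, tmpl_desc in db:
                matches = matcher.match(frame_desc, tmpl_desc)
                good = [m for m in matches if m.distance < 40]
                if len(good) > best_count:
                    best_id, best_count = page_id, len(good)
            return best_id if best_count >= min_matches else None

    In a full system, the geometry of the matched keypoints (e.g., a homography) would additionally localize where on the page the beam is pointing, resolving the "selected object or location" described in the abstract.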
  • Publication number: 20250013320
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a printed page containing visual content, for example displayed in a book or magazine. The page being pointed at may be identified using computer vision applied to images acquired by a device camera pointed in the same direction as the beam. One or more actions may be performed by the device upon identifying that the user has turned to a new page or page section. Such actions may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for children and other learners.
    Type: Application
    Filed: July 1, 2024
    Publication date: January 9, 2025
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 12125407
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a selected object or location within a printed page containing visual content. The object(s) and/or location being pointed at may be determined using a device camera pointed in the same direction as the beam coupled with computer vision methods to determine locations of camera-acquired images within a database of page-content templates. The database may include interactive sequences and/or scenarios associated with page objects and locations, which may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. The device may also generate a beam image reflection on a viewable surface that is in focus at a predetermined distance to encourage a user to position the handheld device at the predetermined distance from the viewable surface.
    Type: Grant
    Filed: October 20, 2023
    Date of Patent: October 22, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Publication number: 20240256057
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a selected object or location within a printed page containing visual content, for example displayed in a book or magazine. The object(s) and/or location being pointed at may be determined using a device camera pointed in the same direction as the beam coupled with computer vision methods to determine locations of camera-acquired images within a database of page-content templates. The database may include interactive sequences and/or scenarios associated with page objects and locations. Such interactive sequences may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for children and other learners.
    Type: Application
    Filed: March 6, 2024
    Publication date: August 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Publication number: 20240257658
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a selected object or location within a printed page containing visual content. The object(s) and/or location being pointed at may be determined using a device camera pointed in the same direction as the beam coupled with computer vision methods to determine locations of camera-acquired images within a database of page-content templates. The database may include interactive sequences and/or scenarios associated with page objects and locations, which may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. The device may also generate a beam image reflection on a viewable surface that is in focus at a predetermined distance to encourage a user to position the handheld device at the predetermined distance from the viewable surface.
    Type: Application
    Filed: October 20, 2023
    Publication date: August 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Patent number: 12033531
    Abstract: Systems and methods are described in which a “one-of-N” selection may be produced based on one or more associations made by a human between a selected object from a group of “N” objects, and visual or acoustic cues produced by a handheld device. Objects may be real or virtual (i.e., displayed) objects in the environment of the handheld device user. Visual and/or acoustic cues may be generated by the handheld device under the control of the device user. Upon determining a perceived association between a cue and a selected object, the user may indicate a selection using one or more handheld device sensing elements such as a pushbutton, touch sensor, or inertial force measurement. Systems and methods may provide simple and intuitive methods to produce selected actions on the handheld device or on a connected device.
    Type: Grant
    Filed: September 13, 2022
    Date of Patent: July 9, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover
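    The "one-of-N" mechanism amounts to cycling a cue across the N candidates and committing whichever candidate is active when the user signals. A minimal event-loop sketch; play_cue and button_pressed are hypothetical stand-ins for the device's output and sensing elements:

        # Sketch: cycle a visual/acoustic cue over N objects; the user presses
        # a button (or taps, or jolts the device) when the cue coincides with
        # the intended object. Device I/O functions are hypothetical.
        import time

        def select_one_of_n(objects, play_cue, button_pressed, dwell_s=1.0):
            """Return the object whose cue was active when the user signaled."""
            while True:
                for obj in objects:
                    play_cue(obj)  # e.g., flash a color or speak a name
                    deadline = time.monotonic() + dwell_s
                    while time.monotonic() < deadline:
                        if button_pressed():
                            return obj
                        time.sleep(0.01)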
  • Publication number: 20240220034
    Abstract: Systems and methods are described in which user reactions to one or more interactions initiated by a handheld device are a basis for controlling the broadcasting of serial content on a content delivery device such as a television or tablet. Interactions initiated on the handheld device may include visual, acoustic and/or haptic components based on the one or more contexts of concurrent (i.e., current or recently broadcast) or ensuing (i.e., not yet broadcast) content. User reactions may be determined based on device movements or orientations (sensed by a camera or inertial measurement unit), verbal replies (sensed by a microphone), and/or pushbutton indications. Based on user reactions and/or predetermined preferences, serial content may be replayed, paused, bypassed, played in slow-motion, or played in a so-called “fast forward” mode. Systems and methods may provide simple and intuitive methods to control serial content delivery that may be particularly well-suited for young children.
    Type: Application
    Filed: February 26, 2024
    Publication date: July 4, 2024
    Inventors: Nelson George Publicover, Lewis James Marggraff
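    The control loop this application describes reduces to mapping a classified user reaction onto a playback command. A minimal dispatch sketch; the reaction labels and the player interface are assumptions, not the application's terms:

        # Sketch: map a classified reaction to a playback command on the
        # content-delivery device. Labels and the player API are hypothetical.
        PLAYBACK_ACTIONS = {
            "confused":   "replay",
            "distracted": "pause",
            "bored":      "skip",
            "engrossed":  "slow_motion",
            "impatient":  "fast_forward",
        }

        def control_playback(reaction, player):
            action = PLAYBACK_ACTIONS.get(reaction)
            if action is not None:
                getattr(player, action)()  # e.g., player.replay()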
  • Patent number: 11989357
    Abstract: Systems and methods are described in which a light beam generated by a handheld device is used to point toward a selected object or location within a printed page containing visual content, for example displayed in a book or magazine. The object(s) and/or location being pointed at may be determined using a device camera pointed in the same direction as the beam coupled with computer vision methods to determine locations of camera-acquired images within a database of page-content templates. The database may include interactive sequences and/or scenarios associated with page objects and locations. Such interactive sequences may augment the visual content of a printed book to include audible elements, additional visual components, and/or haptic stimulation enacted by the handheld device. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for children and other learners.
    Type: Grant
    Filed: July 11, 2023
    Date of Patent: May 21, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 11947399
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at one location on the body of the device from the signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Grant
    Filed: March 4, 2023
    Date of Patent: April 2, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
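    The tap-localization idea can be sketched as windowed classification of IMU samples. The window length, features, and use of scikit-learn's MLP are assumptions; the patent covers neural-network and numerical methods generally:

        # Sketch: classify tap location from a short window of 6-axis IMU
        # samples (accelerometer + gyroscope) with a small neural network.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def features(window):
            """Flatten a (samples, 6) IMU window, plus per-axis extrema."""
            w = np.asarray(window, dtype=float)
            return np.concatenate([w.ravel(), w.max(axis=0), w.min(axis=0)])

        def train_tap_locator(windows, labels):
            """windows: labeled IMU windows; labels: e.g. 'top', 'left side'."""
            X = np.stack([features(w) for w in windows])
            clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
            clf.fit(X, labels)
            return clf

        def locate_tap(clf, window):
            return clf.predict(features(window)[None, :])[0]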
  • Patent number: 11941185
    Abstract: Systems and methods are described in which user reactions to one or more interactions initiated by a handheld device are a basis for controlling the broadcasting of serial content on a content delivery device such as a television or tablet. Interactions initiated on the handheld device may include visual, acoustic and/or haptic components based on the one or more contexts of concurrent (i.e., current or recently broadcast) or ensuing (i.e., not yet broadcast) content. User reactions may be determined based on device movements or orientations (sensed by a camera or inertial measurement unit), verbal replies (sensed by a microphone), and/or pushbutton indications. Based on user reactions and/or predetermined preferences, serial content may be replayed, paused, bypassed, played in slow-motion, or played in a so-called “fast forward” mode. Systems and methods may provide simple and intuitive methods to control serial content delivery that may be particularly well-suited for young children.
    Type: Grant
    Filed: December 29, 2022
    Date of Patent: March 26, 2024
    Assignee: Kibeam Learning, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Publication number: 20240038084
    Abstract: Systems and methods are described in which one or more attributes or cues related to a discoverable object are conveyed to a user of a handheld device via audible, haptic and/or visual means. Using a light beam generated by the handheld device and pointed in the same direction as a device camera, the user may point toward one or more objects in the user's environment believed to be associated with the one or more attributes or cues. Based on the region being pointed at within camera images, the handheld device may classify objects within images to determine if an object matches one or more templates or classifications of the discoverable object. A match, or mismatch, of objects being pointed at may determine subsequent actions. Systems and methods may provide simple and intuitive methods for human-machine interactions that may be particularly well-suited for learners.
    Type: Application
    Filed: May 23, 2023
    Publication date: February 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
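    The match/mismatch decision at the core of this application can be sketched as a single comparison between a classifier's output for the pointed-at region and the cued target. classify_region and the confidence threshold are hypothetical:

        # Sketch: does the object the user is pointing at match the cued
        # discoverable object? The classifier is an assumed black box.
        def check_discovery(frame, pointed_region, target_label, classify_region):
            """Return True on a match; the caller branches to success/retry feedback."""
            label, confidence = classify_region(frame, pointed_region)
            return label == target_label and confidence > 0.6  # assumed threshold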
  • Publication number: 20240036619
    Abstract: Systems and methods are described in which a “one-of-N” selection may be produced based on one or more associations made by a human between a selected object from a group of “N” objects, and visual or acoustic cues produced by a handheld device. Objects may be real or virtual (i.e., displayed) objects in the environment of the handheld device user. Visual and/or acoustic cues may be generated by the handheld device under the control of the device user. Upon determining a perceived association between a cue and a selected object, the user may indicate a selection using one or more handheld device sensing elements such as a pushbutton, touch sensor, or inertial force measurement. Systems and methods may provide simple and intuitive methods to produce selected actions on the handheld device or on a connected device.
    Type: Application
    Filed: September 13, 2022
    Publication date: February 1, 2024
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Publication number: 20240036620
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at one location on the body of the device from the signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Application
    Filed: March 4, 2023
    Publication date: February 1, 2024
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
  • Patent number: 11652654
    Abstract: Systems and methods are described in which the control of virtual activities is cooperatively shared among two or more interactive users. Within selected virtual actions, an initiation component is identified that, when enacted, automatically informs one or more shared-experience users of the intent by the initiator to complete the virtual action and triggers monitoring of the shared-experience users for indications signaling agreement to proceed. When the degree of agreement meets a predetermined threshold or is judged by the initiator to be adequate, the completion component of the virtual actions is enacted on all devices. Automatic incorporation of initiation and completion components during shared activities may allow for cooperative control of virtual activities without repeated and potentially monotonous questions and responses regarding when to proceed.
    Type: Grant
    Filed: November 22, 2021
    Date of Patent: May 16, 2023
    Assignee: KINOO, INC.
    Inventors: Lewis James Marggraff, Nelson George Publicover
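    The initiation/agreement/completion flow can be sketched as a small polling protocol. All device methods and the 50% default threshold below are illustrative assumptions:

        # Sketch: the initiator announces intent (initiation component), peers
        # are monitored for agreement signals, and the action completes on all
        # devices once agreement meets a threshold (completion component).
        def run_shared_action(action, initiator, peers, threshold=0.5,
                              timeout_s=10.0):
            initiator.broadcast_intent(action)
            agreed = set()
            for signal in initiator.poll_agreement(peers, timeout_s):
                if signal.is_agreement:
                    agreed.add(signal.user)
                if len(agreed) / len(peers) >= threshold:
                    for device in [initiator, *peers]:
                        device.complete(action)
                    return True
            return False  # agreement threshold not met before timeout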
  • Publication number: 20230123893
    Abstract: Systems and methods are described in which the control of virtual activities is cooperatively shared among two or more interactive users. Within selected virtual actions, an initiation component is identified that, when enacted, automatically informs one or more shared-experience users of the intent by the initiator to complete the virtual action and triggers monitoring of the shared-experience users for indications signaling agreement to proceed. When the degree of agreement meets a predetermined threshold or is judged by the initiator to be adequate, the completion component of the virtual actions is enacted on all devices. Automatic incorporation of initiation and completion components during shared activities may allow for cooperative control of virtual activities without repeated and potentially monotonous questions and responses regarding when to proceed.
    Type: Application
    Filed: November 22, 2021
    Publication date: April 20, 2023
    Inventors: Lewis James Marggraff, Nelson George Publicover
  • Patent number: 11614781
    Abstract: Systems and methods are described in which the location of a tap on the body of a handheld device is determined in real time using data streams from an embedded inertial measurement unit (IMU). Taps may be generated by striking the handheld device with an object (e.g., a finger), or by moving the handheld device in a manner that causes it to strike another object. IMU accelerometer, gyroscopic and/or orientation (relative to the magnetic and/or gravitational pull of the earth) measurements are examined for signatures that distinguish a tap at one location on the body of the device from the signal characteristics produced by taps at other locations. Neural network and/or numerical methods may be used to perform such classifications.
    Type: Grant
    Filed: July 26, 2022
    Date of Patent: March 28, 2023
    Assignee: KINOO, INC.
    Inventors: Nelson George Publicover, Alexander Paul Barangan, Lewis James Marggraff, Marc Michael Thomas
  • Patent number: 11556755
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but that has already been classified by natural language and/or computer vision processing.
    Type: Grant
    Filed: June 3, 2022
    Date of Patent: January 17, 2023
    Assignee: KINOO, INC.
    Inventors: Nelson George Publicover, Lewis James Marggraff
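    The interruption logic reduces to: when the classified context and a user reaction warrant it, pause delivery, inject an agent snippet tailored to the context and the users' interests, then resume. A minimal sketch with hypothetical classifier outputs and player/agent interfaces:

        # Sketch: intersperse a context-aware virtual-agent snippet into
        # content delivery. All interfaces here are assumed, not specified.
        def maybe_intersperse(player, agent, context, reaction, interests):
            if context.is_teachable and reaction in ("curious", "confused"):
                player.pause()
                agent.play_snippet(topic=context.topic, interests=interests)
                player.resume()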
  • Publication number: 20220309948
    Abstract: Systems and methods are described to enact machine-based, simultaneous classification of emotional and cognitive states of an individual in substantially real time. A bootstrap approach is used to identify indicators of emotional state, including boredom, engagement and feeling overwhelmed, by finding additional interactions temporally adjacent to known indicators. A similar bootstrap approach is used to identify new indicators of knowledge acquisition (or confusion), by locating frequent temporally adjacent interactions. Systems and methods may enhance teaching by identifying knowledge presentations that result in successful knowledge acquisition, and may enhance learning by allowing instructors associated with an individual to track materials that have been understood (or misunderstood). Machine-based tracking of learning may circumvent boredom by avoiding needless repetition, and facilitate teaching of synergistic topics related to materials recently understood.
    Type: Application
    Filed: June 13, 2022
    Publication date: September 29, 2022
    Inventors: Nelson George Publicover, Lewis James Marggraff
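    The bootstrap step described here, finding new indicators among interactions temporally adjacent to known ones, can be sketched as a windowed co-occurrence count. The event format and both thresholds are assumptions:

        # Sketch: promote interaction kinds that repeatedly occur within a
        # short window of known (seed) indicators of a state such as boredom.
        from collections import Counter

        def bootstrap_indicators(events, seeds, window_s=5.0, min_count=10):
            """events: (timestamp, kind) pairs; seeds: known indicator kinds."""
            seed_times = [t for t, kind in events if kind in seeds]
            near = Counter()
            for t, kind in events:
                if kind in seeds:
                    continue
                if any(abs(t - st) <= window_s for st in seed_times):
                    near[kind] += 1
            return {kind for kind, n in near.items() if n >= min_count}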
  • Publication number: 20220292327
    Abstract: Systems and methods are described to enhance interactive engagement during simultaneous delivery of serial or digital content (e.g., audio, video) to a plurality of users. A machine-based awareness of the context of the content and/or one or more user reactions to the presentation of the content may be used as a basis to interrupt content delivery in order to intersperse a snippet that includes a virtual agent with an awareness of the context(s) of the content and/or the one or more user reactions. This “contextual virtual agent” (CVA) enacts actions and/or dialog based on the one or more machine-classified contexts coupled with identified interests and/or aspirations of individuals within the group of users. The CVA may also base its activities on a machine-based awareness of “future” content that has not yet been delivered to the group, but that has already been classified by natural language and/or computer vision processing.
    Type: Application
    Filed: June 3, 2022
    Publication date: September 15, 2022
    Inventors: Nelson George Publicover, Lewis James Marggraff
  • Patent number: 11409359
    Abstract: Systems and methods are described to enact machine-based, collective control by two individuals of one or more displayed virtual objects. Collective interactions may be implemented by combining an ability to specify one or more locations on a touch-sensitive display using one or more digits of a first user with an ability to monitor a portable, handheld controller manipulated by the other user. Alternatively or in addition, pointing by the first hand to the one or more locations on a display may be enhanced by a stylus or other pointing device. The handheld controller may be tracked within camera-acquired images by following camera-trackable controller components and/or by acquiring measurements from one or more embedded inertial measurement units (IMUs). Optionally, one or more switches or sensors may be included within the handheld controller, operable by one or more digits of the second hand to enable alternative virtual object display and/or menu selections during collective interactions.
    Type: Grant
    Filed: November 19, 2021
    Date of Patent: August 9, 2022
    Assignee: KINOO, INC.
    Inventors: Lewis James Marggraff, Nelson George Publicover
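    The collective-control scheme combines two concurrent input streams, one user's touch location and the other user's tracked controller, into updates on the same virtual object. A minimal fusion sketch; the object and pose interfaces are illustrative:

        # Sketch: fuse a touch point (first user) with handheld-controller
        # motion (second user) to cooperatively steer one virtual object.
        def collective_update(touch_xy, controller_pose, virtual_object):
            if touch_xy is not None:
                virtual_object.target = touch_xy        # coarse on-screen pointing
            if controller_pose is not None:
                dx, dy = controller_pose.displacement() # camera/IMU-tracked motion
                virtual_object.nudge(dx, dy)            # fine adjustment
                if controller_pose.button_pressed:
                    virtual_object.open_menu()          # alternative selections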