Patents by Inventor James Marggraff

James Marggraff has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10915814
    Abstract: Systems and methods are described for time-sharing interactions using a shared artificial intelligence personality (AIP) incorporated within multiple human interaction entities (HIEs). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within two or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide two or more users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality.
    Type: Grant
    Filed: June 15, 2020
    Date of Patent: February 9, 2021
    Assignee: Kinoo, Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff, Mary Jo Marggraff
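The abstract above describes periodically folding interactions sensed by multiple HIEs into one shared, cumulative AIP. The sketch below is purely illustrative (all names are hypothetical, not the patented implementation): each entity buffers sensed interactions, and a periodic update merges every buffer into the shared personality.

```python
from dataclasses import dataclass, field

@dataclass
class HIE:
    """A human interaction entity (robot, toy, GUI) that senses interactions."""
    name: str
    sensed_interactions: list = field(default_factory=list)

    def sense(self, interaction: str) -> None:
        self.sensed_interactions.append(interaction)

@dataclass
class AIP:
    """A shared artificial intelligence personality instantiated across HIEs."""
    knowledge: list = field(default_factory=list)

    def update_from(self, hies: list) -> None:
        # Periodic update: fold interactions sensed by every HIE into one
        # cumulative personality, then clear each entity's local buffer.
        for hie in hies:
            self.knowledge.extend(hie.sensed_interactions)
            hie.sensed_interactions.clear()

robot, tablet = HIE("robot"), HIE("tablet")
aip = AIP()
robot.sense("greeted grandparent")
tablet.sense("played word game")
aip.update_from([robot, tablet])
```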
  • Publication number: 20210004680
    Abstract: Systems and methods are described for time-sharing interactions using a shared artificial intelligence personality (AIP) incorporated within multiple human interaction entities (HIEs). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within two or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide two or more users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality.
    Type: Application
    Filed: June 15, 2020
    Publication date: January 7, 2021
    Inventors: Nelson George Publicover, Lewis James Marggraff, Mary Jo Marggraff
  • Publication number: 20200387226
    Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
    Type: Application
    Filed: July 6, 2018
    Publication date: December 10, 2020
    Inventors: Lewis James Marggraff, Eliot Francis Drake
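The abstract above describes using eye tracking data to assign saliency to captured media. One simple way to realize that idea (a hedged sketch under assumed names, not the patented method) is to accumulate gaze dwell time per region of a frame, so regions viewed longest score as most salient:

```python
from collections import defaultdict

def assign_saliency(gaze_samples, grid=4):
    """Accumulate gaze dwell time per region of a media frame.

    gaze_samples: (x, y, dt) tuples with x, y normalized to [0, 1)
    and dt the dwell time in seconds. Returns a dict mapping
    (row, col) grid cells to total dwell time.
    """
    saliency = defaultdict(float)
    for x, y, dt in gaze_samples:
        cell = (int(y * grid), int(x * grid))
        saliency[cell] += dt
    return dict(saliency)

# Two fixations near the top-left corner, one brief glance bottom-right.
samples = [(0.10, 0.10, 0.2), (0.12, 0.08, 0.3), (0.90, 0.90, 0.1)]
saliency = assign_saliency(samples)
```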
  • Patent number: 10762414
    Abstract: Systems and methods are described for sharing an artificial intelligence personality (AIP) among multiple human interaction entities (HIEs). An AIP is an understanding construct that interacts with one or more humans in a human- or pet-like manner, implementing a variety of communication experiences to support a sense of ongoing social connectedness. HIEs may include robots, robotic pets, toys, and avatars. The system may be implemented using two or more HIEs and, optionally, one or more remote and/or distributed processors to compute AIPs and sequence telecommunications. AIPs are periodically updated primarily based on human interactions sensed by the two or more HIEs. HIEs may continue to operate without interruption in the presence of significant telecommunications delays. The systems may provide two or more users with intuitive machine companions that exhibit an integrated knowledge base and personality cumulatively acquired from all users.
    Type: Grant
    Filed: April 19, 2019
    Date of Patent: September 1, 2020
    Assignee: Kinoo, Inc.
    Inventors: Lewis James Marggraff, Nelson George Publicover, Mary Jo Marggraff
  • Patent number: 10627900
    Abstract: An electronic system tracks a user's gaze to rapidly transport a cursor to a location within a focal region of the user's eye. The electronic system transports the cursor from an initial location across one or more displays to a new location in response to detecting one or more saccadic and/or vergence movements of the user's eye or in response to a signal indicating that the user desires to move the cursor to a new location within a focal region of the user's eye or eyes. In some embodiments, the electronic system moves the cursor to the new location along a trajectory wherein the cursor is visible along at least a portion of the trajectory, enabling the user to find the cursor more easily.
    Type: Grant
    Filed: February 15, 2018
    Date of Patent: April 21, 2020
    Assignee: Google LLC
    Inventors: Nelson G. Publicover, Lewis James Marggraff, Spencer James Connaughton
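The abstract above describes transporting a cursor to the gaze focal region when a saccadic eye movement is detected. The following is a minimal sketch (hypothetical thresholds and names, not the patented implementation) that approximates a saccade as a large jump between consecutive gaze samples:

```python
def transport_cursor(cursor, gaze_positions, threshold=5.0):
    """Move the cursor to the gaze focal point after a saccade.

    A saccade is approximated here as a jump between consecutive gaze
    samples larger than `threshold` (e.g., degrees of visual angle).
    The cursor lands at the new gaze position; a real system would also
    animate it along a visible trajectory so the user can find it.
    """
    for prev, cur in zip(gaze_positions, gaze_positions[1:]):
        jump = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        if jump > threshold:
            cursor = cur  # transport to the new focal region
    return cursor

cursor = (0.0, 0.0)
gaze = [(0.1, 0.1), (0.2, 0.1), (12.0, 8.0)]  # last sample is a saccade
cursor = transport_cursor(cursor, gaze)
```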
  • Patent number: 10620700
    Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
    Type: Grant
    Filed: May 9, 2015
    Date of Patent: April 14, 2020
    Assignee: Google LLC
    Inventors: Nelson George Publicover, Lewis James Marggraff, Eliot Francis Drake, Spencer James Connaughton
  • Patent number: 10564714
    Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
    Type: Grant
    Filed: August 15, 2016
    Date of Patent: February 18, 2020
    Assignee: Google LLC
    Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
  • Patent number: 10481682
    Abstract: An electronic system generates at a display virtual writing corresponding to tracked motion of the tip of a pointer with respect to a surface based on proximity of the tip of the pointer to the surface and the gaze of a user's eye. The electronic system determines the location and motion of the tip of the pointer with respect to the surface based on images captured by scene cameras, and determines the focus and gaze direction of the user's eye based on images captured by a user-facing camera. By generating virtual writing at the display corresponding to tracked motion of the tip of the pointer based on proximity of the tip of the pointer to the surface and based on the focus and gaze direction of the user's eye, the electronic system can enable virtual writing and associated collaboration services without the need for a specialized writing surface or pointer.
    Type: Grant
    Filed: March 29, 2017
    Date of Patent: November 19, 2019
    Assignee: Google LLC
    Inventors: Lewis James Marggraff, Nelson G. Publicover, Spencer James Connaughton
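The abstract above gates virtual writing on two conditions: the pointer tip being near the surface and the user's gaze being directed at it. A minimal illustrative sketch of that gating logic (hypothetical parameters, not the patented implementation):

```python
def virtual_ink(tip_samples, gaze_point, surface_z=0.0,
                touch_eps=0.005, gaze_radius=0.1):
    """Emit ink points only when the pointer tip is near the surface
    AND the user's gaze is near the tip; both conditions gate writing.

    tip_samples: (x, y, z) positions of the pointer tip.
    gaze_point: (x, y) gaze location on the surface.
    """
    ink = []
    gx, gy = gaze_point
    for x, y, z in tip_samples:
        near_surface = abs(z - surface_z) < touch_eps
        near_gaze = ((x - gx) ** 2 + (y - gy) ** 2) ** 0.5 < gaze_radius
        if near_surface and near_gaze:
            ink.append((x, y))
    return ink

# Two strokes land; the third is outside the gaze region, the
# fourth has the tip lifted off the surface.
strokes = virtual_ink(
    [(0.10, 0.10, 0.001), (0.11, 0.10, 0.001),
     (0.50, 0.50, 0.001), (0.12, 0.10, 0.200)],
    gaze_point=(0.10, 0.10))
```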
  • Patent number: 10412840
    Abstract: Systems and methods are provided to produce electromechanical interconnections within integrated circuits (ICs), printed circuit boards (PCBs) and between PCBs and other electronic components such as resistors, capacitors and integrated circuits. Elements include so-called “smart pins” or “neuro-pins” that facilitate electrical pathways in the dimension normal to the plane of a PCB. Smart pins or neuro-pins may be inserted using automated processes that do not require the high temperatures normally associated with soldering. Resultant circuits generally contain a large number of layers that are more compact and more readily constructed compared with conventional PCB-based circuitry.
    Type: Grant
    Filed: October 31, 2016
    Date of Patent: September 10, 2019
    Assignee: DOTSLAM, INC.
    Inventors: Lewis James Marggraff, Nelson G. Publicover, Blake Marggraff, Edward D. Krent, Marc M. Thomas
  • Publication number: 20190179418
    Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
    Type: Application
    Filed: July 6, 2018
    Publication date: June 13, 2019
    Inventors: Lewis James Marggraff, Eliot Francis Drake
  • Publication number: 20180275753
    Abstract: An electronic system tracks a user's gaze to rapidly transport a cursor to a location within a focal region of the user's eye. The electronic system transports the cursor from an initial location across one or more displays to a new location in response to detecting one or more saccadic and/or vergence movements of the user's eye or in response to a signal indicating that the user desires to move the cursor to a new location within a focal region of the user's eye or eyes. In some embodiments, the electronic system moves the cursor to the new location along a trajectory wherein the cursor is visible along at least a portion of the trajectory, enabling the user to find the cursor more easily.
    Type: Application
    Filed: February 15, 2018
    Publication date: September 27, 2018
    Inventors: Nelson G. Publicover, Lewis James Marggraff, Spencer James Connaughton
  • Patent number: 10025379
    Abstract: Devices and methods are provided for eye-tracking, e.g., including a freeform optical assembly and/or a modular design. In an exemplary embodiment, a device and method are provided that includes a wearable device on a user's head, the wearable device including a scene camera oriented to capture images of the user's surroundings. The user may perform a predetermined action with the user's eye to activate a photo feature of the wearable device, gaze at a region within the user's surroundings, the wearable device determining a focal point and limited field-of-view for the camera imaging field based on the center point, and activate the camera to capture an image of the limited field-of-view centered around the focal point.
    Type: Grant
    Filed: December 6, 2013
    Date of Patent: July 17, 2018
    Assignee: Google LLC
    Inventors: Eliot Francis Drake, Gholamreza Amayeh, Angelique Kano, Dave Le Blanc, Zhiming Liu, Lewis James Marggraff, Rory Pierce, Nelson G. Publicover, Christopher N. Spitler, Michael Vacchina
  • Publication number: 20180173319
    Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
    Type: Application
    Filed: December 8, 2017
    Publication date: June 21, 2018
    Inventors: Lewis James Marggraff, Eliot Francis Drake
  • Patent number: 9870060
    Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
    Type: Grant
    Filed: December 31, 2014
    Date of Patent: January 16, 2018
    Assignee: Google LLC
    Inventors: Lewis James Marggraff, Eliot Francis Drake
  • Publication number: 20180011533
    Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
    Type: Application
    Filed: August 15, 2016
    Publication date: January 11, 2018
    Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
  • Patent number: 9823744
    Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
    Type: Grant
    Filed: November 2, 2015
    Date of Patent: November 21, 2017
    Assignee: Google Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff, Eliot Francis Drake, Spencer James Connaughton
  • Publication number: 20170285742
    Abstract: An electronic system generates at a display virtual writing corresponding to tracked motion of the tip of a pointer with respect to a surface based on proximity of the tip of the pointer to the surface and the gaze of a user's eye. The electronic system determines the location and motion of the tip of the pointer with respect to the surface based on images captured by scene cameras, and determines the focus and gaze direction of the user's eye based on images captured by a user-facing camera. By generating virtual writing at the display corresponding to tracked motion of the tip of the pointer based on proximity of the tip of the pointer to the surface and based on the focus and gaze direction of the user's eye, the electronic system can enable virtual writing and associated collaboration services without the need for a specialized writing surface or pointer.
    Type: Application
    Filed: March 29, 2017
    Publication date: October 5, 2017
    Inventors: Lewis James Marggraff, Nelson G. Publicover, Spencer James Connaughton
  • Publication number: 20170123492
    Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
    Type: Application
    Filed: August 15, 2016
    Publication date: May 4, 2017
    Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
  • Patent number: 9600069
    Abstract: Apparatus, systems, and methods are provided for substantially continuous biometric identification (CBID) of an individual using eye signals in real time. The apparatus is included within a wearable computing device with identification of the device wearer based on iris recognition within one or more cameras directed at one or both eyes, and/or other physiological, anatomical and/or behavioral measures. Verification of device user identity can be used to enable or disable the display of secure information. Identity verification can also be included within information that is transmitted from the device in order to determine appropriate security measures by remote processing units. The apparatus may be incorporated within wearable computing that performs other functions including vision correction, head-mounted display, viewing the surrounding environment using scene camera(s), recording audio data via a microphone, and/or other sensing equipment.
    Type: Grant
    Filed: May 9, 2015
    Date of Patent: March 21, 2017
    Assignee: Google Inc.
    Inventors: Nelson George Publicover, Lewis James Marggraff
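The abstract above describes continuous biometric identification (CBID) used to enable or disable the display of secure information. A toy sketch of that gating behavior (hypothetical threshold and names; real CBID would involve iris recognition, not a bare score):

```python
def secure_display(frame_scores, match_threshold=0.9):
    """Gate secure content on continuous biometric identification.

    frame_scores: per-frame iris match scores in [0, 1] from an
    eye-facing camera. Secure information is shown only while the
    current frame verifies the wearer's identity, and is hidden the
    moment verification fails.
    """
    return [score >= match_threshold for score in frame_scores]

# Identity verified, verified, lost (e.g., device handed off), regained.
states = secure_display([0.95, 0.97, 0.40, 0.93])
```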
  • Patent number: 9520069
    Abstract: A method, a user interface, and an educational content server for assisting a user in learning using interactive learning appliances are disclosed. The performance information of the user, which may be in the form of a user log file in an interactive learning appliance, may be received at a server computer from a site where the user is present. The performance information and profile information can then be presented in the user interface and used to generate an electronic content package. The electronic content package can then be received and loaded into the interactive learning appliance.
    Type: Grant
    Filed: July 20, 2009
    Date of Patent: December 13, 2016
    Assignee: Leapfrog Enterprises, Inc.
    Inventors: James Marggraff, Matthew Brown, Matt Fishbach
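The abstract above describes a content server that turns a user's performance log into an electronic content package for a learning appliance. A minimal illustrative sketch of that selection step (hypothetical data shapes and cutoff, not the patented method):

```python
def build_content_package(user_log, catalog, cutoff=70):
    """Select lessons for an interactive learning appliance.

    user_log: mapping of skill -> score (0-100) uploaded from the
    appliance. catalog: mapping of skill -> lesson available on the
    content server. Skills scored below `cutoff` get remedial lessons
    in the returned package.
    """
    return [catalog[skill] for skill, score in sorted(user_log.items())
            if score < cutoff and skill in catalog]

log = {"phonics": 55, "counting": 90, "shapes": 62}
package = build_content_package(log, {"phonics": "Phonics Review",
                                      "shapes": "Shape Safari",
                                      "counting": "Counting Fun"})
```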