Patents by Inventor Lewis James Marggraff
Lewis James Marggraff has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
Systems and methods for time-sharing interactions using a shared artificial intelligence personality
Patent number: 10915814
Abstract: Systems and methods are described for time-sharing interactions using a shared artificial intelligence personality (AIP) incorporated within multiple human interaction entities (HIEs). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within two or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide two or more users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality.
Type: Grant
Filed: June 15, 2020
Date of Patent: February 9, 2021
Assignee: Kinoo, Inc.
Inventors: Nelson George Publicover, Lewis James Marggraff, Mary Jo Marggraff
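The abstract above describes behavior at a high level only. As a rough, hypothetical illustration (all class and method names below are invented for this sketch and do not come from the patent), a shared AIP could be modeled as one central state that several interaction entities read and periodically augment with locally sensed interactions:

```python
class SharedAIP:
    """Hypothetical shared AI personality: one state, many endpoints."""
    def __init__(self):
        self.memory = []   # cumulative interaction history from all HIEs
        self.version = 0   # bumped on every periodic update

    def update(self, hie_id, sensed_interactions):
        # Fold interactions sensed by one HIE into the shared personality.
        for event in sensed_interactions:
            self.memory.append((hie_id, event))
        self.version += 1
        return self.version


class HIE:
    """Hypothetical human interaction entity (robot, toy, GUI, ...)."""
    def __init__(self, hie_id, aip):
        self.hie_id = hie_id
        self.aip = aip
        self.buffer = []   # locally sensed events awaiting sync

    def sense(self, event):
        self.buffer.append(event)

    def sync(self):
        # Periodic update: push locally sensed events to the shared AIP.
        version = self.aip.update(self.hie_id, self.buffer)
        self.buffer = []
        return version


aip = SharedAIP()
robot = HIE("robot", aip)
tablet = HIE("tablet", aip)
robot.sense("greeting")
tablet.sense("question")
robot.sync()
tablet.sync()
# Both entities now draw on one cumulative, shared history.
print(len(aip.memory), aip.version)  # -> 2 2
```

The point of the sketch is the cumulative, shared personality: every entity's sensed interactions end up in the same state, so each companion reflects what all users have done.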
Patent number: 10915180
Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Type: Grant
Filed: July 6, 2018
Date of Patent: February 9, 2021
Assignee: Google LLC
Inventors: Lewis James Marggraff, Eliot Francis Drake
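To make the saliency idea concrete, here is a minimal sketch (the function name and data shapes are assumptions for illustration, not from the patent): eye-tracking fixations are tallied per captured frame, giving a crude proxy for which sections of the media most interested the viewer.

```python
def saliency_by_frame(frame_count, fixations):
    """Hypothetical saliency scoring from eye tracking data.

    frame_count: number of captured frames.
    fixations: list of (frame_idx, x, y) gaze fixations, with x and y
    normalized to [0, 1). Returns per-frame fixation counts, a simple
    proxy for which sections of the media drew the viewer's attention."""
    scores = [0] * frame_count
    for frame_idx, x, y in fixations:
        if 0 <= frame_idx < frame_count:
            scores[frame_idx] += 1
    return scores


fix = [(0, 0.5, 0.5), (2, 0.1, 0.9), (2, 0.2, 0.8)]
print(saliency_by_frame(3, fix))  # -> [1, 0, 2]
```

An editing function could then, for example, keep only frames whose score exceeds a threshold, or rank clips by total fixation count.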
Systems and methods for time-sharing interactions using a shared artificial intelligence personality
Publication number: 20210004680
Abstract: Systems and methods are described for time-sharing interactions using a shared artificial intelligence personality (AIP) incorporated within multiple human interaction entities (HIEs). An AIP is an understanding construct that may control a variety of communication experiences to support a sense of ongoing social connectedness. An AIP may be instantiated within two or more HIEs that interact with humans in a human, cartoon or pet-like manner. HIEs may include robots, robotic pets, toys, simple-to-use devices, and graphical user interfaces. The AIP may be periodically updated based on human interactions sensed by the HIEs as well as knowledge of historical and ongoing events. The systems may provide two or more users with intuitive machine companions that exhibit an expert knowledge base and a familiar, cumulative personality.
Type: Application
Filed: June 15, 2020
Publication date: January 7, 2021
Inventors: Nelson George Publicover, Lewis James Marggraff, Mary Jo Marggraff
Publication number: 20200387226
Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Type: Application
Filed: July 6, 2018
Publication date: December 10, 2020
Inventors: Lewis James Marggraff, Eliot Francis Drake
Patent number: 10762414
Abstract: Systems and methods are described for sharing an artificial intelligence personality (AIP) among multiple human interaction entities (HIEs). An AIP is an understanding construct that interacts with one or more humans in a human- or pet-like manner, implementing a variety of communication experiences to support a sense of ongoing social connectedness. HIEs may include robots, robotic pets, toys, and avatars. The system may be implemented using two or more HIEs and, optionally, one or more remote and/or distributed processors to compute AIPs and sequence telecommunications. AIPs are periodically updated primarily based on human interactions sensed by the two or more HIEs. HIEs may continue to operate without interruption in the presence of significant telecommunications delays. The systems may provide two or more users with intuitive machine companions that exhibit an integrated knowledge base and personality cumulatively acquired from all users.
Type: Grant
Filed: April 19, 2019
Date of Patent: September 1, 2020
Assignee: Kinoo, Inc.
Inventors: Lewis James Marggraff, Nelson George Publicover, Mary Jo Marggraff
Patent number: 10627900
Abstract: An electronic system tracks a user's gaze to rapidly transport a cursor to a location within a focal region of the user's eye. The electronic system transports the cursor from an initial location across one or more displays to a new location in response to detecting one or more saccadic and/or vergence movements of the user's eye or in response to a signal indicating that the user desires to move the cursor to a new location within a focal region of the user's eye or eyes. In some embodiments, the electronic system moves the cursor to the new location along a trajectory wherein the cursor is visible along at least a portion of the trajectory, enabling the user to find the cursor more easily.
Type: Grant
Filed: February 15, 2018
Date of Patent: April 21, 2020
Assignee: Google LLC
Inventors: Nelson G. Publicover, Lewis James Marggraff, Spencer James Connaughton
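The two mechanisms named in this abstract, saccade detection and a visible cursor trajectory, can be sketched in a few lines. This is an illustrative guess at one possible implementation, not the patented method; the function names and the 300 deg/s threshold are assumptions (that figure is a commonly cited ballpark for saccadic speeds):

```python
import math

SACCADE_SPEED = 300.0  # deg/s; assumed threshold for a saccadic movement


def detect_saccade(gaze_samples, dt):
    """Return True if the angular gaze speed between any two consecutive
    samples exceeds the saccade threshold. gaze_samples: (x, y) in degrees;
    dt: sampling interval in seconds."""
    for (x0, y0), (x1, y1) in zip(gaze_samples, gaze_samples[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        if speed > SACCADE_SPEED:
            return True
    return False


def transport_cursor(cursor, focal_point, steps=5):
    """Move the cursor toward the gaze focal point along a visible
    trajectory (a short sequence of intermediate positions), rather
    than jumping instantly, so the user can find it."""
    (cx, cy), (fx, fy) = cursor, focal_point
    return [(cx + (fx - cx) * i / steps, cy + (fy - cy) * i / steps)
            for i in range(1, steps + 1)]


samples = [(0.0, 0.0), (0.2, 0.0), (8.0, 0.0)]  # large jump between samples
if detect_saccade(samples, dt=0.01):
    path = transport_cursor((100, 100), (500, 400))
print(path[-1])  # -> (500.0, 400.0)
```

Rendering the intermediate positions in `path` is what makes the cursor visible "along at least a portion of the trajectory", as the abstract puts it.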
Patent number: 10620700
Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Type: Grant
Filed: May 9, 2015
Date of Patent: April 14, 2020
Assignee: Google LLC
Inventors: Nelson George Publicover, Lewis James Marggraff, Eliot Francis Drake, Spencer James Connaughton
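One simple way to detect a voluntary "eye signal" against on-screen targets is a dwell test: the signal fires when gaze stays inside a target region for enough consecutive frames. The sketch below is a hypothetical illustration of that idea only (names, radius, and dwell count are all invented here), not the detection method the patent claims:

```python
def detect_eye_signal(gaze_points, targets, radius=0.05, dwell_frames=10):
    """Hypothetical voluntary eye-signal detector.

    Fires when gaze dwells inside an on-screen target region for
    dwell_frames consecutive frames. gaze_points: list of (x, y) in
    normalized screen coordinates; targets: dict name -> (x, y) center.
    Returns the triggered target's name, or None."""
    streak = {name: 0 for name in targets}
    for gx, gy in gaze_points:
        for name, (tx, ty) in targets.items():
            inside = (gx - tx) ** 2 + (gy - ty) ** 2 <= radius ** 2
            streak[name] = streak[name] + 1 if inside else 0
            if streak[name] >= dwell_frames:
                return name  # trigger the action bound to this target
    return None


targets = {"select": (0.9, 0.1)}
gaze = [(0.9, 0.1)] * 12   # wearer holds gaze on the target
print(detect_eye_signal(gaze, targets))  # -> select
```

Requiring a sustained dwell (rather than a single sample) is one standard way to separate voluntary signals from incidental glances, which is the distinction the abstract emphasizes.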
Patent number: 10564714
Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Type: Grant
Filed: August 15, 2016
Date of Patent: February 18, 2020
Assignee: Google LLC
Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
Patent number: 10481682
Abstract: An electronic system generates at a display virtual writing corresponding to tracked motion of the tip of a pointer with respect to a surface based on proximity of the tip of the pointer to the surface and the gaze of a user's eye. The electronic system determines the location and motion of the tip of the pointer with respect to the surface based on images captured by scene cameras, and determines the focus and gaze direction of the user's eye based on images captured by a user-facing camera. By generating virtual writing at the display corresponding to tracked motion of the tip of the pointer based on proximity of the tip of the pointer to the surface and based on the focus and gaze direction of the user's eye, the electronic system can enable virtual writing and associated collaboration services without the need for a specialized writing surface or pointer.
Type: Grant
Filed: March 29, 2017
Date of Patent: November 19, 2019
Assignee: Google LLC
Inventors: Lewis James Marggraff, Nelson G. Publicover, Spencer James Connaughton
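The core filter this abstract describes, ink is produced only when the pointer tip is near the surface and the user's gaze is near the tip, can be sketched as follows. All names and thresholds here are assumptions for illustration; the patent does not specify this implementation:

```python
def virtual_strokes(tip_samples, gaze_samples, touch_dist=0.01, gaze_dist=0.1):
    """Hypothetical virtual-writing filter.

    A tip position becomes ink only when the tip is within touch_dist of
    the surface AND the gaze point is within gaze_dist of the tip.
    tip_samples: list of (x, y, distance_to_surface);
    gaze_samples: list of (x, y), same length and coordinate frame."""
    strokes = []
    for (tx, ty, d), (gx, gy) in zip(tip_samples, gaze_samples):
        near_surface = d <= touch_dist
        gaze_on_tip = (gx - tx) ** 2 + (gy - ty) ** 2 <= gaze_dist ** 2
        if near_surface and gaze_on_tip:
            strokes.append((tx, ty))
    return strokes


tips = [(0.1, 0.1, 0.005), (0.2, 0.1, 0.005), (0.3, 0.1, 0.05)]
gaze = [(0.1, 0.1), (0.2, 0.12), (0.3, 0.1)]
print(virtual_strokes(tips, gaze))  # -> [(0.1, 0.1), (0.2, 0.1)]
```

The third sample is rejected because the tip has lifted away from the surface; requiring the gaze condition as well is what lets an ordinary fingertip or pen act as the pointer without a specialized writing surface registering false strokes.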
Patent number: 10412840
Abstract: Systems and methods are provided to produce electromechanical interconnections within integrated circuits (ICs), printed circuit boards (PCBs) and between PCBs and other electronic components such as resistors, capacitors and integrated circuits. Elements include so-called "smart pins" or "neuro-pins" that facilitate electrical pathways in the dimension normal to the plane of a PCB. Smart pins or neuro-pins may be inserted using automated processes that do not require the high temperatures normally associated with soldering. Resultant circuits generally contain a large number of layers that are more compact and more readily constructed compared with conventional PCB-based circuitry.
Type: Grant
Filed: October 31, 2016
Date of Patent: September 10, 2019
Assignee: DOTSLAM, INC.
Inventors: Lewis James Marggraff, Nelson G. Publicover, Blake Marggraff, Edward D. Krent, Marc M. Thomas
Publication number: 20190179418
Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Type: Application
Filed: July 6, 2018
Publication date: June 13, 2019
Inventors: Lewis James Marggraff, Eliot Francis Drake
Publication number: 20180275753
Abstract: An electronic system tracks a user's gaze to rapidly transport a cursor to a location within a focal region of the user's eye. The electronic system transports the cursor from an initial location across one or more displays to a new location in response to detecting one or more saccadic and/or vergence movements of the user's eye or in response to a signal indicating that the user desires to move the cursor to a new location within a focal region of the user's eye or eyes. In some embodiments, the electronic system moves the cursor to the new location along a trajectory wherein the cursor is visible along at least a portion of the trajectory, enabling the user to find the cursor more easily.
Type: Application
Filed: February 15, 2018
Publication date: September 27, 2018
Inventors: Nelson G. Publicover, Lewis James Marggraff, Spencer James Connaughton
Patent number: 10025379
Abstract: Devices and methods are provided for eye-tracking, e.g., including a freeform optical assembly and/or a modular design. In an exemplary embodiment, a device and method are provided that includes a wearable device on a user's head, the wearable device including a scene camera oriented to capture images of the user's surroundings. The user may perform a predetermined action with the user's eye to activate a photo feature of the wearable device, gaze at a region within the user's surroundings, the wearable device determining a focal point and limited field-of-view for the camera imaging field based on the center point, and activate the camera to capture an image of the limited field-of-view centered around the focal point.
Type: Grant
Filed: December 6, 2013
Date of Patent: July 17, 2018
Assignee: Google LLC
Inventors: Eliot Francis Drake, Gholamreza Amayeh, Angelique Kano, Dave Le Blanc, Zhiming Liu, Lewis James Marggraff, Rory Pierce, Nelson G. Publicover, Christopher N. Spitler, Michael Vacchina
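The "limited field-of-view centered around the focal point" step amounts to computing a crop rectangle around the gaze point. A minimal sketch, with the function name and the 25% field-of-view fraction assumed for illustration:

```python
def limited_field_of_view(image_w, image_h, focal_point, fov_frac=0.25):
    """Hypothetical crop: given a gaze focal point (normalized x, y),
    return a pixel rectangle (left, top, width, height) covering a
    limited field of view centered on it, clamped to the image bounds."""
    fx, fy = focal_point
    w, h = int(image_w * fov_frac), int(image_h * fov_frac)
    cx, cy = int(image_w * fx), int(image_h * fy)
    left = min(max(cx - w // 2, 0), image_w - w)
    top = min(max(cy - h // 2, 0), image_h - h)
    return (left, top, w, h)


print(limited_field_of_view(1920, 1080, (0.5, 0.5)))  # -> (720, 405, 480, 270)
```

Clamping keeps the crop fully inside the sensor frame when the wearer gazes near an edge of the scene.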
Publication number: 20180173319
Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Type: Application
Filed: December 8, 2017
Publication date: June 21, 2018
Inventors: Lewis James Marggraff, Eliot Francis Drake
Patent number: 9870060
Abstract: Systems are presented herein, which may be implemented in a wearable device. The system is designed to allow a user to edit media images captured with the wearable device. The system employs eye tracking data to control various editing functions, whether prior to the time of capture, during the time of capture, or after the time of capture. Also presented are methods for determining which sections or regions of media images may be of greater interest to a user or viewer. The method employs eye tracking data to assign saliency to captured media. In both the system and the method, eye tracking data may be combined with data from additional sensors in order to enhance operation.
Type: Grant
Filed: December 31, 2014
Date of Patent: January 16, 2018
Assignee: Google LLC
Inventors: Lewis James Marggraff, Eliot Francis Drake
Publication number: 20180011533
Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Type: Application
Filed: August 15, 2016
Publication date: January 11, 2018
Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
Patent number: 9823744
Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system can be included within unobtrusive headwear that performs eye tracking and controls screen display. The system can also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Type: Grant
Filed: November 2, 2015
Date of Patent: November 21, 2017
Assignee: Google Inc.
Inventors: Nelson George Publicover, Lewis James Marggraff, Eliot Francis Drake, Spencer James Connaughton
Publication number: 20170285742
Abstract: An electronic system generates at a display virtual writing corresponding to tracked motion of the tip of a pointer with respect to a surface based on proximity of the tip of the pointer to the surface and the gaze of a user's eye. The electronic system determines the location and motion of the tip of the pointer with respect to the surface based on images captured by scene cameras, and determines the focus and gaze direction of the user's eye based on images captured by a user-facing camera. By generating virtual writing at the display corresponding to tracked motion of the tip of the pointer based on proximity of the tip of the pointer to the surface and based on the focus and gaze direction of the user's eye, the electronic system can enable virtual writing and associated collaboration services without the need for a specialized writing surface or pointer.
Type: Application
Filed: March 29, 2017
Publication date: October 5, 2017
Inventors: Lewis James Marggraff, Nelson G. Publicover, Spencer James Connaughton
Publication number: 20170123492
Abstract: Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
Type: Application
Filed: August 15, 2016
Publication date: May 4, 2017
Inventors: Lewis James Marggraff, Nelson George Publicover, Spencer James Connaughton, Nathan Lord, Peter Milford, Ivan Maleev
Patent number: 9600069
Abstract: Apparatus, systems, and methods are provided for substantially continuous biometric identification (CBID) of an individual using eye signals in real time. The apparatus is included within a wearable computing device with identification of the device wearer based on iris recognition within one or more cameras directed at one or both eyes, and/or other physiological, anatomical and/or behavioral measures. Verification of device user identity can be used to enable or disable the display of secure information. Identity verification can also be included within information that is transmitted from the device in order to determine appropriate security measures by remote processing units. The apparatus may be incorporated within wearable computing that performs other functions including vision correction, head-mounted display, viewing the surrounding environment using scene camera(s), recording audio data via a microphone, and/or other sensing equipment.
Type: Grant
Filed: May 9, 2015
Date of Patent: March 21, 2017
Assignee: Google Inc.
Inventors: Nelson George Publicover, Lewis James Marggraff
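The gating behavior described here, continuous identity verification enabling or disabling secure display, reduces to a per-frame decision. The sketch below is a deliberately simplified illustration (the function name, the score representation, and the 0.9 threshold are all assumptions; real iris recognition is far more involved):

```python
def display_gate(match_scores, match_threshold=0.9):
    """Hypothetical continuous biometric identification (CBID) gate.

    For each camera frame, an iris-match score in [0, 1] either enables
    or disables the display of secure information. Returns a per-frame
    list of True (display enabled) / False (display disabled)."""
    return [score >= match_threshold for score in match_scores]


scores = [0.95, 0.97, 0.40, 0.93]  # low score: wearer looked away or swapped
print(display_gate(scores))  # -> [True, True, False, True]
```

Because the check runs on every frame rather than once at login, secure content disappears as soon as the verified wearer is no longer the one looking at the display.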