Patents by Inventor Robert Konrad
Robert Konrad has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12124624
Abstract: An event camera system for pupil detection may include a camera assembly and a controller, and may also include one or more off-axis light sources. The camera assembly may include one or more infrared (IR) light sources and an event camera. The one or more IR light sources are configured to emit pulses of IR light along an optical path toward an eyebox. The IR light is reflected from an eye in the eyebox, and the reflected light propagates back along the optical path toward the event camera for detection. The controller is configured to determine an orientation of the eye using data output from the event camera.
Type: Grant
Filed: November 15, 2022
Date of Patent: October 22, 2024
Assignee: Sesame AI, Inc.
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban
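As a rough illustration of the pipeline this abstract describes, the sketch below accumulates event-camera output over one IR pulse window and picks a pupil center from it. The event layout, sensor resolution, pulse timing, and the "least-responsive region" heuristic are all assumptions made for this sketch, not the patented method.

```python
# Minimal sketch (assumed details throughout): estimate a pupil position from
# event-camera samples that are time-locked to pulsed IR illumination.
import numpy as np

SENSOR_W, SENSOR_H = 320, 240          # assumed sensor resolution
PULSE_PERIOD_US = 10_000               # assumed IR pulse period (100 Hz)
PULSE_WIDTH_US = 1_000                 # assumed IR pulse width

def accumulate_pulse_events(events, pulse_start_us):
    """Sum signed event polarities that arrive while one IR pulse is on.

    `events` is an (N, 4) array of (t_us, x, y, polarity) with polarity +/-1,
    giving a rough per-pixel response map for that pulse.
    """
    t, x, y, p = events.T
    on = (t >= pulse_start_us) & (t < pulse_start_us + PULSE_WIDTH_US)
    frame = np.zeros((SENSOR_H, SENSOR_W))
    np.add.at(frame, (y[on].astype(int), x[on].astype(int)), p[on])
    return frame

def estimate_pupil_center(frame, dark_quantile=0.05):
    """Simplifying heuristic: centroid of the least-responsive pixels."""
    threshold = np.quantile(frame, dark_quantile)
    ys, xs = np.nonzero(frame <= threshold)
    return float(xs.mean()), float(ys.mean())

# Usage with synthetic random events, just to show the call pattern.
rng = np.random.default_rng(0)
events = np.column_stack([
    rng.integers(0, PULSE_PERIOD_US, 5000),      # timestamps (us)
    rng.integers(0, SENSOR_W, 5000),             # x
    rng.integers(0, SENSOR_H, 5000),             # y
    rng.choice([-1, 1], 5000),                   # polarity
]).astype(float)
frame = accumulate_pulse_events(events, pulse_start_us=0)
print("estimated pupil center:", estimate_pupil_center(frame))
```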
-
Publication number: 20240256529
Abstract: A device comprising a processor and a memory may be configured to perform the techniques described in this disclosure. The processor may present, via one or more portions of a first, second, or third user interface, one or more of an interactive text box or interactive search bar in which a user may enter data indicative of a current input, an interactive log of previous inputs, a graphical representation of result data obtained responsive to the data indicative of the current input, one or more datasets, and at least a portion of the multi-dimensional data included in the one or more datasets. The various portions of the user interfaces are separately scrollable but coupled, such that an interaction in one portion synchronizes the other portions. The memory is configured to store the data indicative of the current input.
Type: Application
Filed: January 26, 2023
Publication date: August 1, 2024
Inventors: Jignesh Patel, Rogers Jeffrey Leo John, Robert Konrad Claus, Jiatong Li, Sulong Zhou, Yukiko Suzuki
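The "separately scrollable but coupled" behavior can be pictured with a small state model: each panel keeps its own scroll offset, and a scroll in one panel is propagated to the panels linked to it. The panel names and the proportional coupling rule below are illustrative assumptions, not the disclosed implementation.

```python
# Toy model of coupled scrolling between separately scrollable UI portions.
from dataclasses import dataclass, field

@dataclass
class Panel:
    name: str
    content_height: int
    offset: int = 0
    linked: list = field(default_factory=list)

    def scroll_to(self, offset, _visited=None):
        _visited = _visited or set()
        if self.name in _visited:
            return
        _visited.add(self.name)
        self.offset = max(0, min(offset, self.content_height))
        # Propagate proportionally so linked panels stay aligned.
        fraction = self.offset / self.content_height
        for other in self.linked:
            other.scroll_to(int(fraction * other.content_height), _visited)

query_log = Panel("query_log", content_height=4000)
results = Panel("results", content_height=12000)
datasets = Panel("datasets", content_height=2000)
query_log.linked = [results, datasets]
results.linked = [query_log, datasets]
datasets.linked = [query_log, results]

query_log.scroll_to(1000)                 # user scrolls the input log 25% down
print({p.name: p.offset for p in (query_log, results, datasets)})
# -> the results and datasets portions follow to 25% of their own heights
```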
-
Publication number: 20240160012
Abstract: An event camera system for pupil detection may include a camera assembly and a controller, and may also include one or more off-axis light sources. The camera assembly may include one or more infrared (IR) light sources and an event camera. The one or more IR light sources are configured to emit pulses of IR light along an optical path toward an eyebox. The IR light is reflected from an eye in the eyebox, and the reflected light propagates back along the optical path toward the event camera for detection. The controller is configured to determine an orientation of the eye using data output from the event camera.
Type: Application
Filed: November 15, 2022
Publication date: May 16, 2024
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban
-
Publication number: 20240098360
Abstract: A tracking system for object detection and tracking. The system may include a plurality of light sources, a differential camera, and a controller. The plurality of light sources is positioned at different locations on a device and is configured to emit pulses of light that illuminate an object. The differential camera has an optical axis, and at least some of the plurality of light sources are off-axis relative to the optical axis. The differential camera is configured to detect a change in brightness of the object caused in part by one or more of the pulses of light, and asynchronously output data samples corresponding to the detected change in brightness. The controller is configured to track the object based in part on the data samples output by the differential camera.
Type: Application
Filed: September 15, 2023
Publication date: March 21, 2024
Inventors: Robert Konrad Konrad, Kevin Conlon Boyle, Gordon Wetzstein
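One simple way to picture the controller's job is a time-multiplexed scheme: each light source pulses in its own time slot, the asynchronous samples landing in that slot are attributed to that source, and a smoothed centroid serves as the tracked position. The slot timing, sample layout, and smoothing rule below are assumptions for illustration only.

```python
# Sketch of tracking from asynchronous differential-camera samples under
# time-multiplexed pulsed illumination (all parameters are assumed).
import numpy as np

N_SOURCES = 4
SLOT_US = 500                    # assumed time slot per light source

def split_by_source(samples):
    """Group asynchronous (t_us, x, y) samples by which source was pulsing."""
    t = samples[:, 0]
    slot = (t // SLOT_US).astype(int) % N_SOURCES
    return {s: samples[slot == s, 1:3] for s in range(N_SOURCES)}

class CentroidTracker:
    def __init__(self, alpha=0.3):
        self.alpha = alpha       # exponential smoothing factor
        self.position = None

    def update(self, samples):
        per_source = split_by_source(samples)
        centroids = [pts.mean(axis=0) for pts in per_source.values() if len(pts)]
        if not centroids:
            return self.position
        measurement = np.mean(centroids, axis=0)
        if self.position is None:
            self.position = measurement
        else:
            self.position = (1 - self.alpha) * self.position + self.alpha * measurement
        return self.position

# Usage with synthetic samples clustered around pixel (100, 80).
rng = np.random.default_rng(1)
samples = np.column_stack([
    rng.integers(0, N_SOURCES * SLOT_US, 400),   # timestamps (us)
    rng.normal(100, 2, 400),                     # x
    rng.normal(80, 2, 400),                      # y
])
tracker = CentroidTracker()
print("tracked position:", tracker.update(samples))
```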
-
Publication number: 20240094811
Abstract: A differential camera system for object tracking. The system includes a co-aligned light source camera assembly (LSCA) and a controller. The co-aligned LSCA includes a light source and a differential camera sensor. The light source is configured to emit light along an optical path that is directed towards an eye box including an eye of a user. The differential camera sensor is configured to detect a change in brightness of the eye caused in part by the emitted light and asynchronously output data samples corresponding to the detected change in brightness, wherein the light source's optical path is substantially co-aligned with an optical path of the differential camera sensor. The controller is configured to identify a pupil of the eye based on data samples output from the differential camera sensor resulting from the emitted light, and determine a gaze location of the user based in part on the identified pupil.
Type: Application
Filed: September 15, 2023
Publication date: March 21, 2024
Inventors: Robert Konrad Konrad, Kevin Conlon Boyle, Gordon Wetzstein, Nitish Padmanaban, John Gabriel Buckmaster
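The last step of this abstract, turning an identified pupil center into a gaze location, is commonly done with a user calibration. The quadratic feature map and least-squares fit below are a generic choice assumed for illustration; they are not taken from the disclosure.

```python
# Sketch: map a pupil center to an on-screen gaze location via a calibration
# fit. Feature map, fit, and calibration targets are illustrative assumptions.
import numpy as np

def features(px, py):
    """Quadratic polynomial features of the pupil center."""
    return np.array([1.0, px, py, px * py, px**2, py**2])

def fit_gaze_map(pupil_centers, screen_points):
    """Least-squares fit from pupil centers to known calibration targets."""
    A = np.array([features(px, py) for px, py in pupil_centers])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points), rcond=None)
    return coeffs                      # shape (6, 2)

def gaze_location(coeffs, pupil_center):
    return features(*pupil_center) @ coeffs

# Usage: a toy 4-point calibration, then a prediction for a new pupil center.
pupil_centers = [(10, 10), (30, 10), (10, 25), (30, 25)]
screen_points = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
coeffs = fit_gaze_map(pupil_centers, screen_points)
print("gaze:", gaze_location(coeffs, (20, 17.5)))
```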
-
Publication number: 20240045215
Abstract: An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity.
Type: Application
Filed: October 19, 2023
Publication date: February 8, 2024
Inventors: Michael Anthony Klug, Robert Konrad, Gordon Wetzstein, Brian T. Schowengerdt, Michal Beau Dennison Vaughn
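The relation between parallax disparity and the accommodation cue can be pictured with simple pinhole geometry: two intra-pupil views whose entry points are separated by a small baseline inside the pupil imply a focal distance of roughly baseline divided by their angular disparity. This approximation is my own simplification for illustration, not the patented optics.

```python
# Illustrative geometry only: intra-pupil baseline and angular parallax
# disparity versus the implied accommodation distance (assumed approximation).
import math

def implied_focal_distance_m(intra_pupil_baseline_m, parallax_disparity_rad):
    """Depth at which rays with this baseline and angular disparity converge."""
    return intra_pupil_baseline_m / parallax_disparity_rad

def disparity_for_depth_rad(intra_pupil_baseline_m, depth_m):
    """Angular disparity a display would introduce to cue a given depth."""
    return intra_pupil_baseline_m / depth_m

baseline = 2e-3                               # assume 2 mm between intra-pupil views
for depth in (0.25, 0.5, 1.0, 2.0):           # target accommodation distances (m)
    d = disparity_for_depth_rad(baseline, depth)
    print(f"depth {depth:4.2f} m -> disparity {math.degrees(d) * 60:5.1f} arcmin")
```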
-
Patent number: 11835724
Abstract: An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity.
Type: Grant
Filed: February 13, 2023
Date of Patent: December 5, 2023
Assignee: Magic Leap, Inc.
Inventors: Michael Anthony Klug, Robert Konrad, Gordon Wetzstein, Brian T. Schowengerdt, Michal Beau Dennison Vaughn
-
Publication number: 20230240606
Abstract: Embodiments are related to a system with a headset capable of monitoring psychomotor performance of a user of the headset based on eyelid tracking information. The headset includes a sensor assembly coupled to a frame of the headset, and a transceiver coupled to the sensor assembly. The sensor assembly is configured to track an eyelid of an eye of the user and capture eyelid tracking information. The transceiver is configured to obtain the eyelid tracking information from the sensor assembly and communicate the eyelid tracking information to a secondary device coupled to the headset for processing the eyelid tracking information and determination of sleep information for the user based in part on the processed eyelid tracking information.
Type: Application
Filed: January 19, 2023
Publication date: August 3, 2023
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban
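The kind of processing a secondary device might run on an eyelid-tracking stream can be illustrated with a few standard drowsiness metrics: blink count, mean blink duration, and the fraction of time the eye is mostly closed (a PERCLOS-style measure). The thresholds and metric set below are assumptions, not the disclosed processing.

```python
# Sketch: summarize an eyelid-aperture time series into blink and PERCLOS-style
# metrics (all thresholds assumed for illustration).
import numpy as np

def eyelid_metrics(aperture, sample_rate_hz, closed_thresh=0.2, perclos_thresh=0.3):
    """`aperture` is eyelid openness in [0, 1] sampled at `sample_rate_hz`."""
    closed = aperture < closed_thresh
    # A blink is a contiguous run of "closed" samples.
    edges = np.diff(closed.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if closed[0]:
        starts = np.r_[0, starts]
    if closed[-1]:
        ends = np.r_[ends, len(closed)]
    durations_s = (ends - starts) / sample_rate_hz
    return {
        "blink_count": len(durations_s),
        "mean_blink_duration_s": float(durations_s.mean()) if len(durations_s) else 0.0,
        "perclos": float((aperture < perclos_thresh).mean()),
    }

# Usage: 10 s of synthetic data at 100 Hz with two brief eye closures.
aperture = np.ones(1000)
aperture[200:215] = 0.05
aperture[700:730] = 0.05
print(eyelid_metrics(aperture, sample_rate_hz=100))
```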
-
Publication number: 20230195220
Abstract: An event camera system for pupil detection may include a camera assembly and a controller, and may also include one or more off-axis light sources. The camera assembly may include one or more infrared (IR) light sources and an event camera. The one or more IR light sources are configured to emit pulses of IR light along an optical path toward an eyebox. The IR light is reflected from an eye in the eyebox, and the reflected light propagates back along the optical path toward the event camera for detection. The controller is configured to determine an orientation of the eye using data output from the event camera.
Type: Application
Filed: November 15, 2022
Publication date: June 22, 2023
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban
-
Publication number: 20230194879
Abstract: An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity.
Type: Application
Filed: February 13, 2023
Publication date: June 22, 2023
Inventors: Michael Anthony Klug, Robert Konrad, Gordon Wetzstein, Brian T. Schowengerdt, Michal Beau Dennison Vaughn
-
Publication number: 20230185798
Abstract: A device configured to perform data analytics comprising a memory and a processor may be configured to perform the techniques described in this disclosure. The memory may store multi-dimensional data. The processor may receive a sequence of inputs defining a recipe for analyzing the multi-dimensional data according to a language sub-surface specifying a natural language containment hierarchy defining a grammar for a natural language as a hierarchical arrangement of a plurality of language sub-surfaces. The processor may also receive data indicative of a summarized narration of the recipe and parameterize a field of the summarized narration to insert a user adjustable parameter that enables manipulation of the underlying recipe and obtain a parameterized summary that includes the user adjustable parameter. The processor may next present, via a first user interface, the parameterized summary.
Type: Application
Filed: December 9, 2021
Publication date: June 15, 2023
Inventors: Jignesh Patel, Robert Konrad Claus, Amos Kendall, Rogers Jeffrey Leo John, Ushmal Ramesh, Jiatong Li
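The parameterized-summary idea can be pictured as a narration template in which one field has been turned into a user-adjustable parameter, so that changing the parameter re-derives the underlying recipe. The dataset, fields, and SQL-style rendering below are invented for illustration and are not the disclosed system.

```python
# Toy parameterized summary: one narration field is a user-adjustable
# parameter, and editing it regenerates the underlying analysis recipe.
from dataclasses import dataclass

@dataclass
class Parameter:
    name: str
    value: object

@dataclass
class ParameterizedSummary:
    template: str                    # narration with {name} placeholders
    recipe_template: str             # underlying analysis step(s)
    parameters: dict

    def _values(self):
        return {p.name: p.value for p in self.parameters.values()}

    def narration(self):
        return self.template.format(**self._values())

    def recipe(self):
        return self.recipe_template.format(**self._values())

    def adjust(self, name, value):
        self.parameters[name].value = value

summary = ParameterizedSummary(
    template="Average sale price per region for {year}.",
    recipe_template="SELECT region, AVG(price) FROM sales WHERE year = {year} GROUP BY region",
    parameters={"year": Parameter("year", 2021)},
)
print(summary.narration())
summary.adjust("year", 2022)          # the user edits the adjustable parameter
print(summary.recipe())               # the underlying recipe follows the edit
```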
-
Patent number: 11662574
Abstract: A device includes a camera assembly and a controller. The camera assembly is configured to capture images of both eyes of a user. Using the captured images, the controller determines a location for each pupil of each eye of the user. The determined pupil locations and captured images are used to determine eye tracking parameters which are used to compute values of eye tracking functions. With the computed values and a model that maps the eye tracking functions to gaze depths, a gaze depth of the user is determined. An action is performed based on the determined gaze depth.
Type: Grant
Filed: November 9, 2021
Date of Patent: May 30, 2023
Assignee: Zinn Labs, Inc.
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban, Gordon Wetzstein
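A concrete way to picture "a model that maps eye tracking functions to gaze depths" is a vergence-style feature computed from the two pupil positions, fitted to known depths at calibration time. The specific feature (horizontal pupil separation) and the reciprocal-linear model below are assumptions for illustration, not the patented model.

```python
# Sketch: fit a calibration model from a pupil-based feature to gaze depth
# (feature choice and model form are assumed for illustration).
import numpy as np

def pupil_separation_px(left_pupil, right_pupil):
    """Horizontal separation between the two pupil centers, in pixels."""
    return float(right_pupil[0] - left_pupil[0])

def fit_depth_model(separations_px, depths_m):
    """Fit 1/depth = a * separation + b by least squares."""
    A = np.column_stack([separations_px, np.ones(len(separations_px))])
    (a, b), *_ = np.linalg.lstsq(A, 1.0 / np.asarray(depths_m), rcond=None)
    return a, b

def gaze_depth_m(model, separation_px):
    a, b = model
    return 1.0 / (a * separation_px + b)

# Calibration: the user fixates targets at known depths while pupils are tracked.
separations = [58.0, 61.0, 63.0, 64.0]          # px, wider when looking far
depths = [0.3, 0.6, 1.2, 3.0]                   # m
model = fit_depth_model(separations, depths)
print(f"estimated gaze depth: {gaze_depth_m(model, 62.0):.2f} m")
```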
-
Publication number: 20230142618
Abstract: Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the tracking system analyzes eye tracking data from the sensor to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.
Type: Application
Filed: November 3, 2022
Publication date: May 11, 2023
Inventors: Kevin Conlon Boyle, Robert Konrad Konrad, Nitish Padmanaban
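A minimal sketch of the feature-to-activity step might look like the following. The two features (saccade rate, mean fixation duration) and the threshold rules are generic illustrations, not the system's actual feature set or classifier.

```python
# Sketch: classify a user activity from eye-tracking features over a time
# window and append it to an activity history (rules assumed for illustration).
from dataclasses import dataclass

@dataclass
class EyeTrackingFeatures:
    saccade_rate_hz: float          # saccades per second in the window
    mean_fixation_s: float          # average fixation duration in seconds

def classify_activity(f: EyeTrackingFeatures) -> str:
    if f.saccade_rate_hz > 3.0 and f.mean_fixation_s < 0.25:
        return "reading"
    if f.saccade_rate_hz < 1.0 and f.mean_fixation_s > 0.8:
        return "watching video"
    return "other"

activity_history = []
for window in [EyeTrackingFeatures(3.5, 0.2), EyeTrackingFeatures(0.5, 1.2)]:
    activity_history.append(classify_activity(window))
print(activity_history)             # the controller updates the activity history
```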
-
Patent number: 11625095
Abstract: Embodiments are related to a plurality of gaze sensors embedded into a frame of a headset for detection of a gaze vector of a user wearing the headset and the user's control at the headset. The gaze vector for an eye of the user can be within a threshold distance from one of the gaze sensors. By monitoring signals detected by the gaze sensors, it can be determined that the gaze vector is within the threshold distance from the gaze sensor. Based on this determination, at least one action associated with the headset is initiated.
Type: Grant
Filed: January 20, 2022
Date of Patent: April 11, 2023
Assignee: Zinn Labs, Inc.
Inventors: Robert Konrad, Kevin Boyle, Nitish Padmanaban, Gordon Wetzstein
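The trigger logic in this abstract can be sketched as: sensors at known directions on the frame, an estimated gaze vector, and an action fired when the gaze comes within a threshold angle of a sensor. The sensor placement, threshold, and actions below are assumptions for illustration.

```python
# Sketch: fire an action when the gaze vector falls within a threshold angle
# of one of the frame-mounted gaze sensors (all values assumed).
import numpy as np

SENSOR_DIRECTIONS = {                 # unit-ish vectors toward each frame sensor
    "volume_up": np.array([0.35, 0.2, 0.92]),
    "volume_down": np.array([0.35, -0.2, 0.92]),
}
ACTIONS = {"volume_up": lambda: print("volume up"),
           "volume_down": lambda: print("volume down")}
THRESHOLD_RAD = np.deg2rad(5.0)

def check_gaze_triggers(gaze_vector):
    g = gaze_vector / np.linalg.norm(gaze_vector)
    for name, direction in SENSOR_DIRECTIONS.items():
        d = direction / np.linalg.norm(direction)
        angle = np.arccos(np.clip(np.dot(g, d), -1.0, 1.0))
        if angle < THRESHOLD_RAD:
            ACTIONS[name]()           # initiate the action tied to that sensor

check_gaze_triggers(np.array([0.36, 0.21, 0.91]))   # gaze near the "volume_up" sensor
```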
-
Patent number: 11614628
Abstract: An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity.
Type: Grant
Filed: January 21, 2022
Date of Patent: March 28, 2023
Assignee: Magic Leap, Inc.
Inventors: Michael Anthony Klug, Robert Konrad, Gordon Wetzstein, Brian T. Schowengerdt, Michal Beau Dennison Vaughn
-
Patent number: 11543883
Abstract: An event camera system for pupil detection may include a camera assembly and a controller, and may also include one or more off-axis light sources. The camera assembly may include one or more infrared (IR) light sources and an event camera. The one or more IR light sources are configured to emit pulses of IR light along an optical path toward an eyebox. The IR light is reflected from an eye in the eyebox, and the reflected light propagates back along the optical path toward the event camera for detection. The controller is configured to determine an orientation of the eye using data output from the event camera.
Type: Grant
Filed: December 16, 2021
Date of Patent: January 3, 2023
Assignee: Zinn Labs, Inc.
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban
-
Publication number: 20220334709
Abstract: A device configured to perform data analytics comprising a memory and a processor may be configured to perform the techniques described in this disclosure. The memory may store multi-dimensional data. The processor may present, via a user interface, a graphical representation of a format for visually representing the multi-dimensional data. The processor may also receive, via the user interface, a selection of an aspect of one or more aspects of the graphical representation of the format. The processor may further receive, via the user interface and for the aspect of the one or more aspects of the graphical representation of the format for visually representing the multi-dimensional data, an indication of a dimension of the multi-dimensional data, and associate the dimension to the aspect to generate a visual representation of the multi-dimensional data. The processor may then present, via the user interface, the visual representation of the multi-dimensional data.
Type: Application
Filed: July 23, 2021
Publication date: October 20, 2022
Inventors: Jiatong Li, Jignesh Patel, Rogers Jeffrey Leo John, Robert Konrad Claus, Nathaniel John Goethel
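The interaction this abstract describes, selecting an aspect of a chart format and associating a data dimension with it, can be pictured as a small binding step that produces a visualization spec. The format name, aspect names, and spec layout below are invented for illustration.

```python
# Toy sketch: associate data dimensions with aspects of a chart format to
# build a visual-representation spec (names and layout assumed).
FORMAT_ASPECTS = {"bar_chart": ["x_axis", "y_axis", "color"]}

def bind_dimension(spec, aspect, dimension):
    """Associate one dimension of the data with one aspect of the format."""
    if aspect not in FORMAT_ASPECTS[spec["format"]]:
        raise ValueError(f"{aspect!r} is not an aspect of {spec['format']!r}")
    spec["bindings"][aspect] = dimension
    return spec

spec = {"format": "bar_chart", "bindings": {}}
bind_dimension(spec, "x_axis", "region")        # user selects the aspect, then
bind_dimension(spec, "y_axis", "total_sales")   # the dimension to associate
bind_dimension(spec, "color", "year")
print(spec)   # the spec now describes the visual representation to render
```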
-
Publication number: 20220238220
Abstract: Embodiments are related to a headset integrated into a healthcare platform. The headset comprises one or more sensors embedded into a frame of the headset, a controller coupled to the one or more sensors, and a transceiver coupled to the controller. The one or more sensors capture health information data for a user wearing the headset. The controller pre-processes at least a portion of the captured health information data to generate a pre-processed portion of the health information data. The transceiver communicates the health information data and the pre-processed portion of health information data to an intermediate device communicatively coupled to the headset. The intermediate device processes at least one of the health information data and the pre-processed portion of health information data to generate processed health information data for a health-related diagnostic of the user.
Type: Application
Filed: January 20, 2022
Publication date: July 28, 2022
Inventors: Robert Konrad, Kevin Boyle, Nitish Padmanaban, Gordon Wetzstein
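The split of work described here, light pre-processing on the headset controller and heavier processing on an intermediate device, might be sketched as follows. The sensor name, the moving-average filter, and the message layout are illustrative assumptions only.

```python
# Sketch: on-headset pre-processing plus the message a transceiver might hand
# to an intermediate device (sensor, filter, and layout are assumed).
import json
import numpy as np

def preprocess(raw_samples, window=10):
    """Moving-average downsample done on the headset controller."""
    raw = np.asarray(raw_samples, dtype=float)
    trimmed = raw[: len(raw) // window * window]
    return trimmed.reshape(-1, window).mean(axis=1)

def build_message(sensor_name, raw_samples):
    """Bundle raw and pre-processed data for the intermediate device."""
    return json.dumps({
        "sensor": sensor_name,
        "raw": [float(v) for v in raw_samples],
        "preprocessed": preprocess(raw_samples).tolist(),
    })

# Usage with a synthetic photoplethysmography-like signal.
raw_ppg = np.sin(np.linspace(0, 6 * np.pi, 100)) + 0.05 * np.random.randn(100)
message = build_message("ppg", raw_ppg)
print("message size:", len(message), "characters queued for the intermediate device")
```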
-
Publication number: 20220236796
Abstract: Embodiments are related to a plurality of gaze sensors embedded into a frame of a headset for detection of a gaze vector of a user wearing the headset and the user's control at the headset. The gaze vector for an eye of the user can be within a threshold distance from one of the gaze sensors. By monitoring signals detected by the gaze sensors, it can be determined that the gaze vector is within the threshold distance from the gaze sensor. Based on this determination, at least one action associated with the headset is initiated.
Type: Application
Filed: January 20, 2022
Publication date: July 28, 2022
Inventors: Robert Konrad, Kevin Boyle, Nitish Padmanaban, Gordon Wetzstein
-
Publication number: 20220197376
Abstract: An event camera system for pupil detection may include a camera assembly and a controller, and may also include one or more off-axis light sources. The camera assembly may include one or more infrared (IR) light sources and an event camera. The one or more IR light sources are configured to emit pulses of IR light along an optical path toward an eyebox. The IR light is reflected from an eye in the eyebox, and the reflected light propagates back along the optical path toward the event camera for detection. The controller is configured to determine an orientation of the eye using data output from the event camera.
Type: Application
Filed: December 16, 2021
Publication date: June 23, 2022
Inventors: Kevin Boyle, Robert Konrad, Nitish Padmanaban