Patents by Inventor Dan Witzner Hansen
Dan Witzner Hansen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11138741
Abstract: A method and system for monitoring the motion of one or both eyes includes capturing a sequence of overlapping images of a subject's face including an eye and the corresponding non-eye region; identifying a plurality of keypoints in each image; mapping corresponding keypoints in two or more images of the sequence; assigning the keypoints to the eye and to the corresponding non-eye region; calculating individual velocities of the corresponding keypoints in the eye and the corresponding non-eye region to obtain a distribution of velocities; extracting at least one velocity measured for the eye and at least one velocity measured for the corresponding non-eye region; calculating the eye-in-head velocity for the eye based upon the measured velocity for the eye and the measured velocity for the corresponding non-eye region; and calculating the eye-in-head position based upon the eye-in-head velocity.
Type: Grant
Filed: May 26, 2017
Date of Patent: October 5, 2021
Assignee: Rochester Institute of Technology
Inventors: Jeff B. Pelz, Dan Witzner Hansen
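The core computation in this abstract, subtracting the non-eye (head) velocity from the eye velocity and integrating the result, can be sketched as follows. This is an illustrative simplification, not the patented implementation; the function name, the choice of the median as the velocity estimator, and the frame interval are all assumptions.

```python
import numpy as np

def eye_in_head_motion(eye_velocities, noneye_velocities, dt=1.0 / 30.0):
    """Hypothetical helper sketching the abstract's final steps.

    eye_velocities / noneye_velocities: (N, 2) arrays of per-keypoint
    image-plane velocities for the eye and the non-eye region.
    """
    v_eye = np.median(eye_velocities, axis=0)      # robust eye-velocity estimate
    v_head = np.median(noneye_velocities, axis=0)  # non-eye keypoints approximate head motion
    v_eye_in_head = v_eye - v_head                 # remove the head-motion component
    position_delta = v_eye_in_head * dt            # integrate over one time step
    return v_eye_in_head, position_delta
```

Using a robust statistic over the keypoint velocity distributions (rather than a single keypoint) is one plausible reading of the abstract's "distribution of velocities" step.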
-
Publication number: 20200320718
Abstract: A method and system for monitoring the motion of one or both eyes includes capturing a sequence of overlapping images of a subject's face including an eye and the corresponding non-eye region; identifying a plurality of keypoints in each image; mapping corresponding keypoints in two or more images of the sequence; assigning the keypoints to the eye and to the corresponding non-eye region; calculating individual velocities of the corresponding keypoints in the eye and the corresponding non-eye region to obtain a distribution of velocities; extracting at least one velocity measured for the eye and at least one velocity measured for the corresponding non-eye region; calculating the eye-in-head velocity for the eye based upon the measured velocity for the eye and the measured velocity for the corresponding non-eye region; and calculating the eye-in-head position based upon the eye-in-head velocity.
Type: Application
Filed: May 26, 2017
Publication date: October 8, 2020
Applicant: Rochester Institute of Technology
Inventors: Jeff B. Pelz, Dan Witzner Hansen
-
Patent number: 10234940
Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eyeball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye, wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
Type: Grant
Filed: February 4, 2016
Date of Patent: March 19, 2019
Assignee: ITU Business Development A/S
Inventors: Diako Mardanbegi, Dan Witzner Hansen
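One iteration of the feedback loop described above can be sketched in miniature: the pupil–glint offset drives the choice of the next light-pattern location from the preconfigured set. All names here are illustrative, and the assumption that shifting the pattern shifts the glint by the same amount is a deliberate simplification, not something the filing claims.

```python
def next_pattern_location(pupil, glint, current_loc, locations):
    """Illustrative single step of the control loop (not the patented
    method): pick the preconfigured light-pattern location that best
    moves the glint toward the tracked pupil feature."""
    # offset between the tracked eye feature and the current glint
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    # simplifying assumption: moving the pattern moves the glint equally,
    # so the ideal new pattern position is the current one plus the offset
    tx, ty = current_loc[0] + dx, current_loc[1] + dy
    # snap to the nearest of the preconfigured pattern locations
    return min(locations, key=lambda p: (p[0] - tx) ** 2 + (p[1] - ty) ** 2)
```

Calling this once per video frame and re-measuring the glint closes the loop: the glint is steered until it sits at the predetermined position relative to the pupil.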
-
Publication number: 20180239423
Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eyeball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye, wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
Type: Application
Filed: February 4, 2016
Publication date: August 23, 2018
Inventors: Diako Mardanbegi, Dan Witzner Hansen
-
Publication number: 20170123491
Abstract: A computer-implemented method of communicating via interaction with a user-interface based on a person's gaze and gestures, comprising: computing an estimate of the person's gaze, comprising computing a point-of-regard on a display through which the person observes a scene in front of him; by means of a scene camera, capturing a first image of the scene in front of the person's head (at least partially visible on the display) and computing the location of an object coinciding with the person's gaze; by means of the scene camera, capturing at least one further image of the scene in front of the person's head, and monitoring whether the gaze dwells on the recognised object; and, while the gaze dwells on the recognised object: firstly, displaying a user interface element with a spatial expanse on the display in a region adjacent to the point-of-regard; and secondly, during movement of the display, awaiting and detecting the event that the point-of-regard coincides with the spatial expanse of the displayed user interface element.
Type: Application
Filed: March 16, 2015
Publication date: May 4, 2017
Inventors: Dan Witzner Hansen, Diako Mardanbegi
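The "gaze dwells on the recognised object" condition above can be sketched as a small dwell timer. The class name and the half-second threshold are hypothetical, not part of the filing.

```python
class DwellTrigger:
    """Minimal dwell detector: fires once gaze has stayed on the same
    object for a threshold duration (illustrative sketch only)."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold  # required dwell time in seconds
        self.current = None         # object id the gaze currently rests on
        self.elapsed = 0.0          # accumulated dwell time on that object

    def update(self, object_id, dt):
        """Feed one gaze sample; returns True while the dwell condition holds."""
        if object_id == self.current:
            self.elapsed += dt
        else:
            # gaze moved to a different object (or to no object at all)
            self.current, self.elapsed = object_id, dt
        return self.current is not None and self.elapsed >= self.threshold
```

In the method described above, the trigger firing would correspond to displaying the user interface element next to the point-of-regard; the element is then activated when the point-of-regard later enters its spatial expanse.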
-
Patent number: 9405364
Abstract: A method of filtering glints by processing an image of a user's cornea to obtain coordinates of desired glints from a configuration of light sources, comprising: processing an image, in a first image space, of a user's cornea to determine coordinates of respective multiple positions of glints; and iteratively: selecting from the coordinates a first and a second set of coordinates; computing from the first set of coordinates a transformation that transforms the first set of coordinates into first coordinates of a predetermined spatial configuration; and testing whether the transformation also transforms the second set into positions that match second positions of the predetermined configuration. The coordinates of the desired glints are selected as those first and second sets which are transformed into coordinates that match the first and second coordinates of the predetermined configuration. The method is based on a geometrical homography and is expedient for robust gaze estimation in connection with e.g.
Type: Grant
Filed: October 29, 2010
Date of Patent: August 2, 2016
Assignee: IT-Universitetet i København
Inventor: Dan Witzner Hansen
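The fit-then-verify iteration described above can be sketched in a RANSAC-like style. This sketch is simplified in two labeled ways: it fits an affine transform rather than the homography named in the abstract, and it brute-forces assignments with `itertools.permutations` rather than whatever selection strategy the patent specifies; the function name and tolerance are also assumptions.

```python
import numpy as np
from itertools import permutations

def match_glints(detected, pattern, tol=0.05):
    """Illustrative glint filtering: find which detected glints correspond
    to the known light-source pattern by fitting a transform on a first
    subset and verifying it on the rest (affine stand-in for a homography)."""
    detected = np.asarray(detected, dtype=float)
    pattern = np.asarray(pattern, dtype=float)
    n = len(pattern)
    best_err, best_idx = None, None
    for idx in permutations(range(len(detected)), n):
        src = detected[list(idx)]
        # fit the transform from the first three correspondences ("first set")
        S = np.hstack([src[:3], np.ones((3, 1))])
        A, *_ = np.linalg.lstsq(S, pattern[:3], rcond=None)
        # test whether the remaining points ("second set") also land on the pattern
        mapped = np.hstack([src, np.ones((n, 1))]) @ A
        err = float(np.abs(mapped - pattern).max())
        if best_err is None or err < best_err:
            best_err, best_idx = err, idx
    return best_idx if best_err is not None and best_err < tol else None
```

Assignments containing spurious reflections fail the verification step, which is the filtering effect the abstract describes.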
-
Patent number: 9398848
Abstract: A method of performing eye gaze tracking of at least one eye of a user by determining the position of the center of the eye, said method comprising the steps of: detecting the position of at least three reflections on said eye; transforming said positions to span a normalized coordinate system spanning a frame of reference; detecting the position of said center of the eye relative to the position of said reflections and transforming this position to said normalized coordinate system; and tracking the eye gaze by tracking the movement of said eye in said normalized coordinate system. Thereby calibration of a camera, such as knowledge of the exact position and zoom level of the camera, is avoided. Further, it is not necessary to know the position of light sources. This results in a much more flexible and user-friendly system.
Type: Grant
Filed: July 8, 2008
Date of Patent: July 26, 2016
Assignee: IT-University of Copenhagen
Inventor: Dan Witzner Hansen
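The normalization step above, expressing the eye center in a frame of reference spanned by the reflections, can be sketched as a change of basis. This is an illustration of why camera position and zoom drop out, under the assumption of an affine parameterization; it is not the patented transformation (the related publication mentions bilinear, Möbius, and homographic variants).

```python
import numpy as np

def normalize_pupil(glints, pupil):
    """Express the eye-center position in coordinates spanned by three
    corneal reflections (illustrative sketch). The resulting (u, v) is
    unchanged by translating or uniformly scaling the image, which is
    why camera position and zoom level need not be calibrated."""
    g0, g1, g2 = (np.asarray(g, dtype=float) for g in glints)
    basis = np.column_stack([g1 - g0, g2 - g0])  # 2x2 basis built from the glints
    return np.linalg.solve(basis, np.asarray(pupil, dtype=float) - g0)
```

Tracking then proceeds on these normalized coordinates rather than on raw pixel positions.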
-
Publication number: 20140002349
Abstract: A method of filtering glints by processing an image of a user's cornea to obtain coordinates of desired glints from a configuration of light sources, comprising: processing an image, in a first image space, of a user's cornea to determine coordinates of respective multiple positions of glints; and iteratively: selecting from the coordinates a first and a second set of coordinates; computing from the first set of coordinates a transformation that transforms the first set of coordinates into first coordinates of a predetermined spatial configuration; and testing whether the transformation also transforms the second set into positions that match second positions of the predetermined configuration. The coordinates of the desired glints are selected as those first and second sets which are transformed into coordinates that match the first and second coordinates of the predetermined configuration. The method is based on a geometrical homography and is expedient for robust gaze estimation in connection with e.g.
Type: Application
Filed: October 29, 2010
Publication date: January 2, 2014
Applicant: IT-Universitetet i København
Inventor: Dan Witzner Hansen
-
Publication number: 20110182472
Abstract: This invention relates to a method of performing eye gaze tracking of at least one eye of a user by determining the position of the center of the eye, said method comprising the steps of: detecting the position of at least three reflections on said eye; transforming said positions to span a normalized coordinate system spanning a frame of reference, wherein said transformation is performed based on a bilinear transformation or a non-linear transformation, e.g. a Möbius transformation or a homographic transformation; detecting the position of said center of the eye relative to the position of said reflections and transforming this position to said normalized coordinate system; and tracking the eye gaze by tracking the movement of said eye in said normalized coordinate system. Thereby calibration of a camera, such as knowledge of the exact position and zoom level of the camera, is avoided. Further, it is not necessary to know the position of light sources.
Type: Application
Filed: July 8, 2008
Publication date: July 28, 2011
Inventor: Dan Witzner Hansen