Patents by Inventor Dan Witzner Hansen

Dan Witzner Hansen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11138741
Abstract: A method and system for monitoring the motion of one or both eyes includes capturing a sequence of overlapping images of a subject's face including an eye and the corresponding non-eye region; identifying a plurality of keypoints in each image; mapping corresponding keypoints in two or more images of the sequence; assigning the keypoints to the eye and to the corresponding non-eye region; calculating individual velocities of the corresponding keypoints in the eye and the corresponding non-eye region to obtain a distribution of velocities; extracting at least one velocity measured for the eye and at least one velocity measured for the corresponding non-eye region; calculating the eye-in-head velocity for the eye based upon the measured velocity for the eye and the measured velocity for the corresponding non-eye region; and calculating the eye-in-head position based upon the eye-in-head velocity.
    Type: Grant
    Filed: May 26, 2017
    Date of Patent: October 5, 2021
    Assignee: Rochester Institute of Technology
    Inventors: Jeff B. Pelz, Dan Witzner Hansen
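The core idea of the abstract above — subtracting the head's motion (measured from non-eye keypoints) from the eye's apparent motion, then integrating — can be sketched as follows. This is a minimal illustration, not the patented implementation: the use of the median as the summary of each velocity distribution, and the function names, are assumptions.

```python
import statistics

def eye_in_head_velocity(eye_velocities, head_velocities):
    """Recover eye-in-head velocity by subtracting the head's motion
    (summarised from non-eye keypoint velocities) from the eye's
    apparent motion in the image.

    The median is used here as a robust summary of each velocity
    distribution; this choice is an assumption of the sketch.
    """
    return statistics.median(eye_velocities) - statistics.median(head_velocities)

def eye_in_head_positions(velocities, dt):
    """Integrate per-frame eye-in-head velocities (units/s) over a fixed
    frame interval dt (s) to obtain eye-in-head positions."""
    positions, pos = [], 0.0
    for v in velocities:
        pos += v * dt
        positions.append(pos)
    return positions
```

With eye keypoints moving at about 5 units/frame and head keypoints at about 2, the eye-in-head velocity comes out to 3, and position follows by summation.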
  • Publication number: 20200320718
Abstract: A method and system for monitoring the motion of one or both eyes includes capturing a sequence of overlapping images of a subject's face including an eye and the corresponding non-eye region; identifying a plurality of keypoints in each image; mapping corresponding keypoints in two or more images of the sequence; assigning the keypoints to the eye and to the corresponding non-eye region; calculating individual velocities of the corresponding keypoints in the eye and the corresponding non-eye region to obtain a distribution of velocities; extracting at least one velocity measured for the eye and at least one velocity measured for the corresponding non-eye region; calculating the eye-in-head velocity for the eye based upon the measured velocity for the eye and the measured velocity for the corresponding non-eye region; and calculating the eye-in-head position based upon the eye-in-head velocity.
    Type: Application
    Filed: May 26, 2017
    Publication date: October 8, 2020
    Applicant: Rochester Institute of Technology
    Inventors: Jeff B. Pelz, Dan Witzner Hansen
  • Patent number: 10234940
Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eye ball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye; wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
    Type: Grant
    Filed: February 4, 2016
    Date of Patent: March 19, 2019
    Assignee: ITU Business Development A/S
    Inventors: Diako Mardanbegi, Dan Witzner Hansen
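The feedback loop described above — measure the pupil/glint offset, then move the light pattern to one of its preconfigured locations so the glint tracks the pupil — can be sketched as a one-dimensional toy model. Everything here is simplified for illustration: the glint is assumed to appear exactly where the pattern is emitted, and the function names are invented.

```python
def nearest_location(locations, target):
    """Index of the preconfigured pattern location closest to `target`."""
    return min(range(len(locations)), key=lambda i: abs(locations[i] - target))

def run_control_loop(pupil, locations, start_idx, steps=5):
    """Toy 1-D control loop: on each iteration, measure the offset between
    the pupil and the glint caused by the current light pattern, then move
    the pattern to the preconfigured location that best cancels the offset."""
    idx = start_idx
    for _ in range(steps):
        glint = locations[idx]      # glint caused by the current pattern
        offset = pupil - glint      # feedback signal from image processing
        idx = nearest_location(locations, glint + offset)
    return idx
```

Starting from location 0 with the pupil at 21, the loop settles on the preconfigured location at 20, the closest achievable match.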
  • Publication number: 20180239423
Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eye ball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye; wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
    Type: Application
    Filed: February 4, 2016
    Publication date: August 23, 2018
Inventors: Diako Mardanbegi, Dan Witzner Hansen
  • Publication number: 20170123491
    Abstract: A computer-implemented method of communicating via interaction with a user-interface based on a person's gaze and gestures, comprising: computing an estimate of the person's gaze comprising computing a point-of-regard on a display through which the person observes a scene in front of him; by means of a scene camera, capturing a first image of a scene in front of the person's head (and at least partially visible on the display) and computing the location of an object coinciding with the person's gaze; by means of the scene camera, capturing at least one further image of the scene in front of the person's head, and monitoring whether the gaze dwells on the recognised object; and while gaze dwells on the recognised object: firstly, displaying a user interface element, with a spatial expanse, on the display face in a region adjacent to the point-of-regard; and secondly, during movement of the display, awaiting and detecting the event that the point-of-regard coincides with the spatial expanse of the displayed use
    Type: Application
    Filed: March 16, 2015
    Publication date: May 4, 2017
    Inventors: Dan Witzner Hansen, Diako Mardanbegi
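The interaction sequence in the abstract above — dwell on a recognised object, then show a UI element near the point-of-regard and detect when gaze enters it — amounts to a small state machine over gaze samples. The sketch below assumes axis-aligned bounding boxes and a frame-count dwell threshold; both are simplifications, and the names are invented for illustration.

```python
def inside(box, x, y):
    """True if (x, y) lies within box = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = box
    return xmin <= x <= xmax and ymin <= y <= ymax

def gaze_dwell_select(gaze_samples, object_box, element_box, dwell_frames):
    """Two-phase state machine over point-of-regard samples:
    1) count consecutive samples dwelling on the recognised object;
       once the dwell threshold is reached, the UI element is 'shown';
    2) report a selection when a later sample enters the element."""
    dwell, shown = 0, False
    for x, y in gaze_samples:
        if not shown:
            dwell = dwell + 1 if inside(object_box, x, y) else 0
            shown = dwell >= dwell_frames
        elif inside(element_box, x, y):
            return True
    return False
```

Gaze that wanders off the object resets the dwell counter, so the element is never shown and no selection is reported.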
  • Patent number: 9405364
    Abstract: A method of filtering glints by processing an image of a user's cornea to obtain coordinates of desired glints from a configuration of light sources, comprising processing an image, in a first image space, of a user's cornea to determine coordinates of respective multiple positions of glints; and iteratively: selecting from the coordinates a first and a second set of coordinates; computing from the first set of coordinates a transformation that transforms the first set of coordinates into first coordinates of a predetermined spatial configuration; and testing whether the transformation transforms also the second set into positions that match second positions of the predetermined configuration. The coordinates of the desired glints are selected as those first and second sets which are transformed into coordinates that match the first and second coordinates of the predetermined configuration. The method is based on a geometrical homography and is expedient for robust gaze estimation in connection with e.g.
    Type: Grant
    Filed: October 29, 2010
    Date of Patent: August 2, 2016
    Assignee: IT-Universitetet i København
    Inventor: Dan Witzner Hansen
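The iterative select-fit-test procedure in the abstract above can be sketched with a simpler transform family: the code below fits a 2-D similarity transform (complex multiplication plus offset) from two candidate glints to a canonical light-source layout, then tests whether the same transform maps the remaining candidates onto the rest of the layout. The patent's method is based on a homography, which additionally handles perspective distortion; the unit-square `CANONICAL` layout is likewise an assumption.

```python
from itertools import permutations

# Assumed canonical light-source configuration (unit square, as complex
# numbers); in practice this would come from the hardware layout.
CANONICAL = [0 + 0j, 1 + 0j, 1 + 1j, 0 + 1j]

def filter_glints(candidates, tol=1e-6):
    """Search candidate glint coordinates (complex numbers) for four points
    that a single similarity transform z -> a*z + b maps onto CANONICAL.
    Returns the matching candidates in CANONICAL order, or None."""
    for combo in permutations(candidates, 4):
        if combo[1] == combo[0]:
            continue
        # Fit the transform from the first two correspondences ...
        a = (CANONICAL[1] - CANONICAL[0]) / (combo[1] - combo[0])
        b = CANONICAL[0] - a * combo[0]
        # ... and keep it only if it also maps the remaining two.
        if all(abs(a * c + b - q) < tol
               for c, q in zip(combo[2:], CANONICAL[2:])):
            return list(combo)
    return None
```

Spurious reflections are rejected because no single transform maps them, together with the true glints, onto the canonical configuration.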
  • Patent number: 9398848
Abstract: A method of performing eye gaze tracking of at least one eye of a user, by determining the position of the center of the eye, said method comprising the steps of: detecting the position of at least three reflections on said eye, transforming said positions to span a normalized coordinate system spanning a frame of reference, detecting the position of said center of the eye relative to the position of said reflections and transforming this position to said normalized coordinate system, tracking the eye gaze by tracking the movement of said eye in said normalized coordinate system. Thereby calibration of a camera, such as knowledge of the exact position and zoom level of the camera, is avoided. Further, it is not necessary to know the position of light sources. This results in a much more flexible and user friendly system.
    Type: Grant
    Filed: July 8, 2008
    Date of Patent: July 26, 2016
    Assignee: IT-UNIVERSITY OF COPENHAGEN
    Inventor: Dan Witzner Hansen
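The normalization step described above can be illustrated with three reflections defining an affine frame: expressing the eye centre in coordinates relative to the reflections makes the result independent of camera position and zoom, since any affine change of the image maps all four points together. This is a sketch under that affine assumption; the related publication also covers bilinear and homographic variants, which this code does not implement.

```python
def normalized_eye_position(center, r1, r2, r3):
    """Express the eye centre in the affine frame spanned by three corneal
    reflections r1, r2, r3 (each an (x, y) tuple), i.e. solve
        center = r1 + u*(r2 - r1) + v*(r3 - r1)
    for (u, v). The result is unchanged by camera translation, rotation,
    and zoom, because those transform all four points identically."""
    ax, ay = r2[0] - r1[0], r2[1] - r1[1]
    bx, by = r3[0] - r1[0], r3[1] - r1[1]
    px, py = center[0] - r1[0], center[1] - r1[1]
    det = ax * by - ay * bx  # non-zero when the reflections are not collinear
    u = (px * by - py * bx) / det
    v = (ax * py - ay * px) / det
    return u, v
```

Scaling and translating the whole image (e.g. a camera zoom and shift) leaves the normalized coordinates unchanged, which is exactly why no camera calibration is needed.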
  • Publication number: 20140002349
    Abstract: A method of filtering glints by processing an image of a user's cornea to obtain coordinates of desired glints from a configuration of light sources, comprising processing an image, in a first image space, of a user's cornea to determine coordinates of respective multiple positions of glints; and iteratively: selecting from the coordinates a first and a second set of coordinates; computing from the first set of coordinates a transformation that transforms the first set of coordinates into first coordinates of a predetermined spatial configuration; and testing whether the transformation transforms also the second set into positions that match second positions of the predetermined configuration. The coordinates of the desired glints are selected as those first and second sets which are transformed into coordinates that match the first and second coordinates of the predetermined configuration. The method is based on a geometrical homography and is expedient for robust gaze estimation in connection with e.g.
    Type: Application
    Filed: October 29, 2010
    Publication date: January 2, 2014
    Applicant: IT-UNIVERSITETET I KØBENHAVN
    Inventor: Dan Witzner Hansen
  • Publication number: 20110182472
Abstract: This invention relates to a method of performing eye gaze tracking of at least one eye of a user, by determining the position of the center of the eye, said method comprising the steps of: detecting the position of at least three reflections on said eye, transforming said positions to span a normalized coordinate system spanning a frame of reference, wherein said transformation is performed based on a bilinear transformation or a non-linear transformation, e.g. a Möbius transformation or a homographic transformation, detecting the position of said center of the eye relative to the position of said reflections and transforming this position to said normalized coordinate system, tracking the eye gaze by tracking the movement of said eye in said normalized coordinate system. Thereby calibration of a camera, such as knowledge of the exact position and zoom level of the camera, is avoided. Further, it is not necessary to know the position of light sources.
    Type: Application
    Filed: July 8, 2008
    Publication date: July 28, 2011
    Inventor: Dan Witzner Hansen