Patents by Inventor Diako Mardanbegi

Diako Mardanbegi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10234940
    Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eyeball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye; wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
    Type: Grant
    Filed: February 4, 2016
    Date of Patent: March 19, 2019
    Assignee: ITU Business Development A/S
    Inventors: Diako Mardanbegi, Dan Witzner Hansen
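
The feedback loop this abstract describes — stepping the light pattern among preconfigured locations so that the glint it produces tracks the pupil — can be sketched roughly as follows. This is an illustrative sketch only, not the patented implementation; all names, the 1-D geometry, and the assumption that the glint mirrors the pattern location are assumptions.

```python
# Hypothetical sketch of the glint-feedback control loop: measure the
# offset between the tracked feature (pupil) and the glint, then step
# the light pattern one preconfigured location in that direction.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

# Preconfigured light-pattern locations on the display (assumed 1-D row).
PATTERN_LOCATIONS = [Point(float(x), 0.0) for x in range(0, 100, 10)]

def offset(pupil: Point, glint: Point) -> float:
    """Horizontal offset between the pupil and the glint."""
    return pupil.x - glint.x

def next_location_index(current: int, off: float) -> int:
    """Move the pattern one preconfigured step toward reducing the offset."""
    if off > 0 and current < len(PATTERN_LOCATIONS) - 1:
        return current + 1
    if off < 0 and current > 0:
        return current - 1
    return current  # glint already tracks the feature

def control_loop(pupil_x: float, start: int = 0, steps: int = 20) -> int:
    """Repeat the measure-and-move cycle, as in the claimed control loop."""
    idx = start
    for _ in range(steps):
        glint = PATTERN_LOCATIONS[idx]  # assume glint mirrors pattern location
        idx = next_location_index(idx, offset(Point(pupil_x, 0.0), glint))
    return idx
```

With a pattern spacing of 10 units, the loop settles on (or oscillates between) the locations bracketing the pupil position, which is the steady state of the claimed feedback.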
  • Publication number: 20180284886
    Abstract: A computer-implemented method and a computer for recovering a visual event, comprising: by means of a graphical user interface, the contents of a viewport are displayed to a user as the viewport is progressively moved across graphical portions of a visual media object; while the contents of the viewport are displayed, recording an eye movement signal that is indicative of the movements of at least one eye of the user; classifying temporal sections of the eye movement signal into at least a class of long slow-phase OKN eye movements occurring among short slow-phase eye movements; setting a synchronization marker at least for a first occurrence of a temporal section classified as a smooth pursuit eye movement; wherein the synchronization marker comprises a link to or impression information of the contents of the viewport at the point in time when the first occurrence of a smooth pursuit eye movement occurred; via the synchronization marker, recovering the impression information or the contents of the viewport that wa
    Type: Application
    Filed: September 26, 2016
    Publication date: October 4, 2018
    Inventors: Diako Mardanbegi, Shahram Jalaliniya, John Paulin Hansen
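
The classification step in this abstract — labelling temporal sections of the eye-movement signal as long or short slow-phase movements and setting a synchronization marker at the first long (smooth-pursuit-like) section — could look roughly like this. The threshold, the section representation, and all names are assumptions for illustration.

```python
# Illustrative sketch: sections are (start_ms, duration_ms) pairs; a
# section is a "long" slow phase if its duration exceeds an assumed
# threshold, and the synchronization marker is set at the first one.
LONG_PHASE_MS = 300  # assumed boundary between long and short slow phases

def classify(sections):
    """Label each (start_ms, duration_ms) section as 'long' or 'short'."""
    return [("long" if dur >= LONG_PHASE_MS else "short", start)
            for start, dur in sections]

def first_marker(sections):
    """Start time of the first long slow-phase section, or None.

    In a real system this timestamp would link to the viewport
    contents (impression information) at that moment.
    """
    for label, start in classify(sections):
        if label == "long":
            return start
    return None
```

Recovering the visual event then amounts to looking up the viewport contents stored under the marker's timestamp.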
  • Publication number: 20180239423
    Abstract: A gaze tracker and a computer-implemented method for gaze tracking, comprising the steps of: recording video images of a being's eye such that an eye pupil and a glint on the eyeball caused by a light source are recorded; processing the video images to compute an offset between the position of the predetermined spatial feature and a predetermined position with respect to the glint; by means of the light source, such as a display, emitting light from a light pattern at a location selected among a multitude of preconfigured locations of light patterns towards the being's eye; wherein the location is controlled by a feedback signal; controlling the location of the light pattern from one location to another location among the predefined locations of light patterns, in response to the offset, such that the predetermined position with respect to the glint caused by the light source tracks the predetermined spatial feature of the being's eye; wherein the above steps are repeated to establish a control loop with
    Type: Application
    Filed: February 4, 2016
    Publication date: August 23, 2018
    Inventors: Diako Mardanbegi, Dan Witzner Hansen
  • Patent number: 10037312
    Abstract: A gaze annotation method for an image includes: receiving a user command to capture and display a captured image; receiving a second user command to create an annotation for the displayed image; in response to the second user command, receiving from the gaze tracking device a point-of-regard estimating a user's gaze in the displayed image; displaying an annotation anchor on the image proximate to the point-of-regard; and receiving a spoken annotation from the user and associating the spoken annotation with the annotation anchor. A gaze annotation method for a real-world scene includes: receiving a field of view and location information; receiving from the gaze tracking device a point-of-regard from the user located within the field of view; capturing and displaying a captured image of the field of view; while capturing the image, receiving a spoken annotation from the user; and displaying an annotation anchor on the image.
    Type: Grant
    Filed: March 24, 2015
    Date of Patent: July 31, 2018
    Assignee: FUJI XEROX CO., LTD.
    Inventors: Diako Mardanbegi, Pernilla Qvarfordt
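
The annotation flow in this abstract — placing an anchor at the point-of-regard and binding a spoken annotation to it — can be sketched with a minimal data model. The classes and field names below are assumptions, not the patented implementation, and the gaze tracker and speech transcription are stubbed out as plain inputs.

```python
# Hypothetical data model: an annotation couples a gaze-derived anchor
# position with a (transcribed) spoken annotation on a captured image.
from dataclasses import dataclass, field

@dataclass
class Annotation:
    anchor: tuple  # (x, y) proximate to the point-of-regard
    speech: str    # the spoken annotation associated with the anchor

@dataclass
class AnnotatedImage:
    pixels: object = None                      # captured image data (stub)
    annotations: list = field(default_factory=list)

    def annotate(self, point_of_regard, spoken_text):
        """Place an anchor at the gaze point and bind the speech to it."""
        ann = Annotation(anchor=point_of_regard, speech=spoken_text)
        self.annotations.append(ann)
        return ann
```

The real-world-scene variant described in the second half of the abstract would differ mainly in when the point-of-regard and speech are captured (during image capture), not in this anchor-plus-speech association.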
  • Publication number: 20170123491
    Abstract: A computer-implemented method of communicating via interaction with a user-interface based on a person's gaze and gestures, comprising: computing an estimate of the person's gaze comprising computing a point-of-regard on a display through which the person observes a scene in front of him; by means of a scene camera, capturing a first image of a scene in front of the person's head (and at least partially visible on the display) and computing the location of an object coinciding with the person's gaze; by means of the scene camera, capturing at least one further image of the scene in front of the person's head, and monitoring whether the gaze dwells on the recognised object; and while gaze dwells on the recognised object: firstly, displaying a user interface element, with a spatial expanse, on the display face in a region adjacent to the point-of-regard; and secondly, during movement of the display, awaiting and detecting the event that the point-of-regard coincides with the spatial expanse of the displayed use
    Type: Application
    Filed: March 16, 2015
    Publication date: May 4, 2017
    Inventors: Dan Witzner Hansen, Diako Mardanbegi
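
The dwell-then-select interaction this abstract describes — showing a user-interface element once the gaze dwells on a recognised object, then detecting the point-of-regard entering the element's spatial expanse — reduces to two small predicates. The frame-based dwell criterion and the rectangle geometry are illustrative assumptions.

```python
# Hedged sketch of the two detection steps: (1) has the gaze dwelt on
# the recognised object long enough, and (2) has the point-of-regard
# entered the displayed element's spatial expanse?
DWELL_FRAMES = 10  # assumed consecutive-frame count that counts as a dwell

def dwells_on(object_hits):
    """True if the gaze stayed on the object for DWELL_FRAMES consecutive
    frames; object_hits is a per-frame bool sequence of gaze-on-object."""
    run = 0
    for hit in object_hits:
        run = run + 1 if hit else 0
        if run >= DWELL_FRAMES:
            return True
    return False

def inside(point, rect):
    """Is the point-of-regard inside the element's (x, y, w, h) expanse?"""
    x, y, w, h = rect
    px, py = point
    return x <= px <= x + w and y <= py <= y + h
```

In the claimed flow, `dwells_on` gates the display of the element next to the point-of-regard, and `inside` is the event awaited while the display moves.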
  • Publication number: 20160283455
    Abstract: A gaze annotation method for an image includes: receiving a user command to capture and display a captured image; receiving a second user command to create an annotation for the displayed image; in response to the second user command, receiving from the gaze tracking device a point-of-regard estimating a user's gaze in the displayed image; displaying an annotation anchor on the image proximate to the point-of-regard; and receiving a spoken annotation from the user and associating the spoken annotation with the annotation anchor. A gaze annotation method for a real-world scene includes: receiving a field of view and location information; receiving from the gaze tracking device a point-of-regard from the user located within the field of view; capturing and displaying a captured image of the field of view; while capturing the image, receiving a spoken annotation from the user; and displaying an annotation anchor on the image.
    Type: Application
    Filed: March 24, 2015
    Publication date: September 29, 2016
    Inventors: Diako Mardanbegi, Pernilla Qvarfordt