Patents by Inventor Mårten Skogö
Mårten Skogö has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20140247215
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2014
Publication date: September 4, 2014
Applicant: Tobii Technology AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
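The delayed-move behaviour this abstract describes can be sketched as a small state machine: the indicator jumps to the gaze target only after the action has been held for a dwell period, and releasing early aborts the move. The class, the dwell time, and the coordinate handling below are illustrative assumptions, not the patented implementation.

```python
class DelayedGazeCursor:
    """Sketch of a visual indicator that moves to the gaze target only
    after a user action (e.g. a touchpad touch) is held long enough."""

    def __init__(self, dwell_s=0.3):
        self.dwell_s = dwell_s      # hold time before the indicator moves
        self.indicator = (0, 0)     # current indicator position on screen
        self._touch_start = None    # time the action began, if active

    def touch_down(self, t):
        self._touch_start = t

    def touch_up(self, t):
        # Releasing before the dwell period elapses aborts the move.
        self._touch_start = None

    def update(self, t, gaze_target):
        # Once the dwell period has elapsed, the indicator tracks gaze
        # for fine adjustment while the action continues.
        if self._touch_start is not None and t - self._touch_start >= self.dwell_s:
            self.indicator = gaze_target
        return self.indicator
```

Holding for less than the dwell time leaves the indicator where it was, which is the "abort" window mentioned in the abstract.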
-
Publication number: 20140168271
Abstract: A graphics presentation apparatus including a display unit, an eye-tracking module, and a data output module. The eye-tracking module registers image data representing at least one eye of a user of the apparatus. Furthermore, the eye-tracking module determines, based on the registered image data, an orientation of the at least one eye relative to the display unit. Finally, in response thereto, the eye-tracking module generates a control signal controlling the data output module to produce visual content with such orientation on the display unit that a misalignment between the orientation of the visual content and the orientation of the at least one eye of the user is minimized.
Type: Application
Filed: December 11, 2013
Publication date: June 19, 2014
Applicant: Tobii Technology AB
Inventors: Aron Yu, Marten Skogo, Robert Gavelin, Per Nystedt
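One way to picture the alignment step is to estimate the roll of the user's eyes from the two pupil positions and rotate the content by the same angle. This is a minimal sketch of that idea under assumed 2-D pupil coordinates; the real apparatus may determine orientation quite differently.

```python
import math

def eye_roll_angle(left_eye, right_eye):
    """Angle (degrees) of the line through both pupils; 0 when level."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def content_rotation(left_eye, right_eye):
    # Rotating the content by the eyes' roll angle minimises the
    # misalignment between content orientation and eye orientation.
    return eye_roll_angle(left_eye, right_eye)
```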
-
Publication number: 20140146156
Abstract: A method of presenting gaze-point data of a subject detected by an eye-tracking unit includes presenting a test scene picture acquired by a camera unit, and displaying shapes on the test scene picture. The shapes represent momentary gaze points of the subject.
Type: Application
Filed: January 13, 2014
Publication date: May 29, 2014
Applicant: Tobii Technology AB
Inventors: Johan Strombom, Marten Skogo, Per Nystedt, Simon Gustafsson, John Mikael Holtz Elvesjo, Peter Blixt
-
Publication number: 20140043227
Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
Type: Application
Filed: August 8, 2013
Publication date: February 13, 2014
Applicant: Tobii Technology AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
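The memoryless-then-recursive estimation scheme can be sketched as follows: the first gaze point is computed from the burst alone (here a simple average), while later values blend each new measurement with the previous estimate (here exponential smoothing). Both the averaging and the smoothing factor are illustrative stand-ins, not taken from the patent.

```python
def first_gaze_from_burst(burst):
    """Memoryless estimate: average of the gaze candidates in the burst."""
    n = len(burst)
    return (sum(p[0] for p in burst) / n, sum(p[1] for p in burst) / n)

def recursive_gaze(prev, measurement, alpha=0.5):
    """Recursive update: blend the new measurement with the previous value,
    so earlier gaze points (or earlier eye pictures) influence the result."""
    return (prev[0] + alpha * (measurement[0] - prev[0]),
            prev[1] + alpha * (measurement[1] - prev[1]))
```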
-
Publication number: 20140009390
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: August 6, 2013
Publication date: January 9, 2014
Applicant: Tobii Technology AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
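The request-then-deliver pattern in this family of abstracts is essentially a subscription model: each GUI component registers the subset of non-cursor-controlling signals it needs, and the event engine delivers only that subset. The sketch below assumes invented signal names ("dwell", "saccade", "blink") purely for illustration.

```python
class EventEngine:
    """Minimal sketch of the subscription pattern the abstract describes."""

    def __init__(self):
        self._requests = {}  # component name -> set of requested signal names

    def register(self, component, signal_names):
        """A component's control signal request: the sub-set it requires."""
        self._requests[component] = set(signal_names)

    def dispatch(self, signals):
        """Deliver each produced signal only to components that requested it."""
        delivered = {}
        for component, wanted in self._requests.items():
            delivered[component] = {k: v for k, v in signals.items() if k in wanted}
        return delivered
```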
-
Publication number: 20130321270
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: August 6, 2013
Publication date: December 5, 2013
Applicant: Tobii Technology AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Publication number: 20130326431
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: August 6, 2013
Publication date: December 5, 2013
Applicant: Tobii Technology AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Publication number: 20130318457
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: August 6, 2013
Publication date: November 28, 2013
Applicant: Tobii Technology AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Publication number: 20130106681
Abstract: A personal computer system includes a visual display, an imaging device adapted to provide eye-tracking data by imaging the face of a viewer of the visual display, and an input device for accepting eye-tracking control data and other input data. The imaging device is switchable between at least an active mode, a ready mode and an idle mode, and the switching time from the idle mode to the active mode is longer than the switching time from the ready mode to the active mode. The eye-tracking data provided in the ready mode may include eye position but not eye orientation. The system may include a dedicated input device for forcing the imaging device into active mode by directly triggering an interrupt; alternatively, the dedicated input device forces it permanently into idle mode.
Type: Application
Filed: October 27, 2011
Publication date: May 2, 2013
Applicant: Tobii Technology AB
Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
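The three-mode switching can be sketched as a small state machine. The latency numbers below are invented and chosen only to reflect the ordering the abstract states: idle to active takes longer than ready to active.

```python
# Illustrative wake-up latencies in seconds (assumed, not from the patent).
WAKE_LATENCY_S = {
    ("idle", "active"): 1.0,    # cold start: longest switching time
    ("ready", "active"): 0.1,   # warm start: much shorter
    ("idle", "ready"): 0.5,
    ("active", "ready"): 0.0,
    ("active", "idle"): 0.0,
    ("ready", "idle"): 0.0,
}

class ImagingDevice:
    """Sketch of an imaging device switchable between active/ready/idle."""

    def __init__(self):
        self.mode = "idle"

    def switch(self, target):
        """Switch mode and return the (illustrative) latency incurred."""
        latency = WAKE_LATENCY_S.get((self.mode, target), 0.0)
        self.mode = target
        return latency
```

Keeping the device in ready mode trades a little standby power for a much faster transition into active tracking, which is the design trade-off the abstract implies.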
-
Patent number: 8339446
Abstract: An eye tracker includes at least one illuminator for illuminating an eye, at least two cameras for imaging the eye, and a controller. The configuration of the reference illuminator(s) and cameras is such that at least one camera is coaxial with a reference illuminator and at least one camera is non-coaxial with a reference illuminator. The controller is adapted to select one of the cameras to be active to maximize an image quality metric and avoid obscuring objects. The eye tracker is operable in a dual-camera mode to improve accuracy. A method and computer-program product for selecting a combination of an active reference illuminator from a number of reference illuminators, and an active camera from a plurality of cameras, are provided.
Type: Grant
Filed: April 1, 2010
Date of Patent: December 25, 2012
Assignee: Tobii Technology AB
Inventors: Peter Blixt, Anders Hoglund, Gunnar Troili, John Elvesjö, Mårten Skogö
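The selection step described here reduces to an argmax over illuminator/camera combinations. The sketch below assumes the quality metric is supplied by the caller; the metric used in the test is an arbitrary stand-in, not the patented one.

```python
def select_combination(illuminators, cameras, quality):
    """Return the (illuminator, camera) pair with the highest quality score.

    `quality` is a caller-supplied function scoring an image captured with a
    given illuminator/camera combination (higher is better)."""
    best = None
    best_score = float("-inf")
    for ill in illuminators:
        for cam in cameras:
            score = quality(ill, cam)
            if score > best_score:
                best, best_score = (ill, cam), score
    return best
```

In practice the metric might penalise combinations obscured by eyeglass reflections, which is how "avoid obscuring objects" could fold into the same maximisation.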
-
Publication number: 20120146895
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: December 22, 2011
Publication date: June 14, 2012
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Patent number: 8185845
Abstract: A computer based eye-tracking solution is disclosed. A computer apparatus is associated with one or more graphical displays (GUI components) that may be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine is adapted to produce a set of non-cursor controlling event output signals, which influence the GUI-components. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the proposed event engine receives a control signal request from each of the GUI-components. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.
Type: Grant
Filed: May 24, 2005
Date of Patent: May 22, 2012
Assignee: Tobii Technology AB
Inventors: Christoffer Bjorklund, Henrik Eskilsson, Magnus Jacobson, Marten Skogo
-
Patent number: 8066375
Abstract: The present invention relates to automatic registration and tracking of the eyes of at least one subject. An optical system, including a lens structure, a mask and an image sensor, receives incoming light from a scene containing the subject and directs at least a portion of this light towards the image sensor, which registers spatially distributed light and thus produces primary data. The mask is adapted to alter a basic optical transfer function of the lens structure and the image sensor into an enhanced optical transfer function, which is substantially less sensitive to variations of an unknown distance between the optical system and the subject than the basic optical transfer function. The processing unit is adapted to receive the primary data and process this data to produce resulting eye-tracking data representing a position estimate of the at least one eye and/or a gaze direction for the at least one eye.
Type: Grant
Filed: August 28, 2006
Date of Patent: November 29, 2011
Assignee: Tobii Technology AB
Inventors: Mårten Skogö, John Elvesjö, Bengt Rehnström
-
Publication number: 20110279666
Abstract: A gaze-point detection system includes at least one infrared (IR) signal source to be placed in a test scene as a reference point, a pair of eye glasses to be worn by a person, and a data processing and storage unit for calculating a gaze point of the person wearing the pair of eye glasses. The pair of eye glasses includes an image sensor, an eye-tracking unit and a camera. The image sensor detects IR signals from the at least one IR signal source and generates an IR signal source tracking signal. The eye-tracking unit is adapted to determine the gaze direction of the person and generates an eye-tracking signal, and the camera acquires a test scene picture. The data processing and storage unit communicates with the pair of eye glasses and calculates the gaze point relative to the test scene picture.
Type: Application
Filed: January 26, 2009
Publication date: November 17, 2011
Inventors: Johan Strömbom, Simon Gustafsson, Marten Skogo, John Mikael Holtz Elvesjö, Per Nystedt, Peter Blixt
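One simplified way to relate a gaze point to the test scene picture is via the IR reference markers: if two markers are located both by the glasses' sensor and in the scene camera's picture, a 2-D similarity transform (rotation, scale, translation) can be solved from the correspondences and applied to the gaze point. This is an assumed simplification; a real system may use more markers and a full homography.

```python
def fit_similarity(src_pair, dst_pair):
    """Solve z -> a*z + b (in complex form) from two point correspondences,
    e.g. IR marker positions seen by the glasses (src) and where the same
    markers appear in the test scene picture (dst)."""
    s0, s1 = complex(*src_pair[0]), complex(*src_pair[1])
    d0, d1 = complex(*dst_pair[0]), complex(*dst_pair[1])
    a = (d1 - d0) / (s1 - s0)   # rotation + scale
    b = d0 - a * s0             # translation
    return a, b

def map_gaze(gaze, a, b):
    """Map a gaze point through the fitted transform into picture coordinates."""
    z = a * complex(*gaze) + b
    return (z.real, z.imag)
```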
-
Publication number: 20100328444
Abstract: An eye tracker includes at least one illuminator for illuminating an eye, at least two cameras for imaging the eye, and a controller. The configuration of the reference illuminator(s) and cameras is such that at least one camera is coaxial with a reference illuminator and at least one camera is non-coaxial with a reference illuminator. The controller is adapted to select one of the cameras to be active to maximize an image quality metric and avoid obscuring objects. The eye tracker is operable in a dual-camera mode to improve accuracy. A method and computer-program product for selecting a combination of an active reference illuminator from a number of reference illuminators, and an active camera from a plurality of cameras, are provided.
Type: Application
Filed: April 1, 2010
Publication date: December 30, 2010
Applicant: Tobii Technology AB
Inventors: Peter Blixt, Anders Hoglund, Gunnar Troili, John Elvesjö, Mårten Skogö
-
Patent number: 7572008
Abstract: When detecting the position and gaze direction of eyes, a photo sensor (1) and light sources (2, 3) placed around a display and a calculation and control unit (6) are used. One of the light sources is placed around the sensor and includes inner and outer elements (3′; 3″). When only the inner elements are illuminated, a strong bright eye effect in a captured image is obtained, resulting in a simple detection of the pupils and thereby a safe determination of gaze direction. When only the outer elements and the outer light sources (2) are illuminated, a determination of the distance of the eye from the photo sensor is made. After it has been possible to determine the pupils in an image, in the following captured images only those areas around the pupils are evaluated where the images of the eyes are located. Which one of the eyes is the left eye and which the right eye can be determined by following the images of the eyes and evaluating their positions in successively captured images.
Type: Grant
Filed: November 21, 2003
Date of Patent: August 11, 2009
Assignee: Tobii Technology AB
Inventors: John Elvesjo, Marten Skogo, Gunnar Elvers
-
Publication number: 20080284980
Abstract: The present invention relates to automatic registration and tracking of the eyes of at least one subject. An optical system, including a lens structure, a mask and an image sensor, receives incoming light from a scene containing the subject and directs at least a portion of this light towards the image sensor, which registers spatially distributed light and thus produces primary data. The mask is adapted to alter a basic optical transfer function of the lens structure and the image sensor into an enhanced optical transfer function, which is substantially less sensitive to variations of an unknown distance between the optical system and the subject than the basic optical transfer function. The processing unit is adapted to receive the primary data and process this data to produce resulting eye-tracking data representing a position estimate of the at least one eye and/or a gaze direction for the at least one eye.
Type: Application
Filed: August 28, 2006
Publication date: November 20, 2008
Applicant: Tobii Technology AB
Inventors: Marten Skogo, John Elvesjo, Bengt Rehnstrom
-
Patent number: 7266995
Abstract: For measuring the surface tension between a liquid and a fluid such as a gas, a capillary (3, 3′) is used in which the liquid slowly flows and at the end of which drops (11) are formed, falling off into a closed space (7) containing the fluid. Using a pressure sensor (5, 5′) the pressure is measured, which can be the absolute pressure of a fluid volume enclosed in the closed space or alternatively a differential pressure measured as the pressure difference between the liquid in the capillary and the fluid contained in the closed space. The pressure is measured when one or more drops are formed and fall off. The obtained pressure curves are evaluated electronically (12) and provide a value of the surface tension. The measurement can be made within a fairly short time with high operational reliability. The temperature difference between the drop and the surrounding fluid is small, resulting in little precipitation of salts dissolved in the liquid and reducing the risk that the liquid capillary will be blocked.
Type: Grant
Filed: August 9, 2002
Date of Patent: September 11, 2007
Assignee: Jenser Technology AB
Inventors: Mårten Skogö, John Elvesjö
-
Publication number: 20070164990
Abstract: A computer based eye-tracking solution is disclosed. A computer apparatus is associated with one or more graphical displays (GUI components) that may be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine is adapted to produce a set of non-cursor controlling event output signals, which influence the GUI-components. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the proposed event engine receives a control signal request from each of the GUI-components. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the GUI-components in accordance with each respective control signal request.
Type: Application
Filed: May 24, 2005
Publication date: July 19, 2007
Inventors: Christoffer Bjorklund, Henrik Eskilsson, Magnus Jacobson, Marten Skogo
-
Publication number: 20040177680
Abstract: For measuring the surface tension between a liquid and a fluid such as a gas, a capillary (3, 3′) is used in which the liquid slowly flows and at the end of which drops (11) are formed, falling off into a closed space (7) containing the fluid. Using a pressure sensor (5, 5′) the pressure is measured, which can be the absolute pressure of a fluid volume enclosed in the closed space or alternatively a differential pressure measured as the pressure difference between the liquid in the capillary and the fluid contained in the closed space. The pressure is measured when one or more drops are formed and fall off. The obtained pressure curves are evaluated electronically (12) and provide a value of the surface tension. The measurement can be made within a fairly short time with high operational reliability.
Type: Application
Filed: May 6, 2004
Publication date: September 16, 2004
Inventors: Marten Skogo, John Elvesjo
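The physics underlying the pressure-curve evaluation in the two surface tension patents above is the Young-Laplace relation: for a drop surface that is hemispherical at the capillary tip, the pressure excess peaks at Δp = 2γ/r, so γ = Δp·r/2. The function below illustrates only this relation, not the patented evaluation of the full pressure curves.

```python
def surface_tension_from_peak(delta_p, radius):
    """Surface tension gamma (N/m) from the peak pressure excess delta_p (Pa)
    when the drop surface is hemispherical with the capillary radius (m),
    via the Young-Laplace relation delta_p = 2 * gamma / radius."""
    return delta_p * radius / 2.0
```

For example, a peak excess of 288 Pa at a 0.5 mm capillary corresponds to about 0.072 N/m, close to the surface tension of water against air.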