Patents by Inventor Ibrahim Eden

Ibrahim Eden is named as an inventor on the patent filings listed below. The listing includes pending patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9183197
    Abstract: Automated language translation often involves language translation resources of significant size (e.g., 50-gigabyte phrase tables) and significant computational power exceeding the capabilities of many mobile devices. Remotely accessible servers capable of near-realtime, automated translation may be inaccessible or prohibitively costly while traveling abroad. Presented herein are adaptations of language translation techniques for offline mobile devices involving reducing the size and raising the efficiency of the language modeling resources. A word index may be provided that stores respective string representations of the words of a language, and maps respective words to a location (e.g., address or offset) of respective word representations within the word index. Language translation resources (e.g., phrase tables) may then specify logical relationships using the word index addresses of the involved words, rather than the string equivalents.
    Type: Grant
    Filed: December 14, 2012
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ibrahim Eden, Christopher Quirk, Anthony Aue, Michel Galley, Frederik Schaffalitzky
  • Publication number: 20150310253
    Abstract: Embodiments are disclosed for eye tracking systems and methods. An example eye tracking system comprises a plurality of light sources and a camera configured to capture an image of light from the light sources as reflected from an eye. The eye tracking system further comprises a logic device and a storage device storing instructions executable by the logic device to acquire frames of eye tracking data by iteratively projecting light from different combinations of light sources of the plurality of light sources and capturing an image of the eye during projection of each combination. The instructions may be further executable to select a combination of light sources for eye tracking based on a determination of occlusion detected in the image, where the occlusion arises from a transparent or semi-transparent optical structure positioned between the eye and the camera, and to project light via the selected combination of light sources for eye tracking.
    Type: Application
    Filed: April 29, 2014
    Publication date: October 29, 2015
    Inventors: Mudit Agrawal, Vaibhav Thukral, Ibrahim Eden, David Nister, Shivkumar Swaminathan
  • Publication number: 20150310657
    Abstract: Examples relating to attracting the gaze of a viewer of a display are disclosed. One example method comprises controlling the display to display a target object and using gaze tracking data to monitor a viewer gaze location. A guide element is displayed moving along a computed dynamic path that passes adjacent to the viewer gaze location and leads to the target object. If the viewer's gaze location is within a predetermined divergence threshold of the guide element, then the display continues displaying the guide element moving along the computed dynamic path to the target object. If the viewer's gaze location diverges from the guide element by at least the predetermined divergence threshold, then the display discontinues displaying the guide element moving along the computed dynamic path to the target object.
    Type: Application
    Filed: April 29, 2014
    Publication date: October 29, 2015
    Inventor: Ibrahim Eden
  • Publication number: 20150293587
    Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer has changed a gaze location. Based on that determination, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
    Type: Application
    Filed: April 10, 2014
    Publication date: October 15, 2015
    Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
  • Publication number: 20150261293
    Abstract: Embodiments are disclosed that relate to gaze-based remote device control. For example, one disclosed embodiment provides, on a computing device, a method comprising detecting a gaze direction of a user, detecting an indication from the user to control a remotely controllable device located in the gaze direction, and adapting a user interface of a controller device to enable user control of the remotely controllable device.
    Type: Application
    Filed: March 12, 2014
    Publication date: September 17, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, Ibrahim Eden, David Nister
  • Publication number: 20140375541
    Abstract: Embodiments are disclosed that relate to tracking a user's eye based on time-of-flight depth image data of the user's eye. For example, one disclosed embodiment provides an eye tracking system comprising a light source, a sensing subsystem configured to obtain a two-dimensional image of a user's eye and depth data of the user's eye using a depth sensor having an unconstrained baseline distance, and a logic subsystem configured to control the light source to emit light, control the sensing subsystem to acquire a two-dimensional image of the user's eye while the light source is emitting light, control the sensing subsystem to acquire depth data of the user's eye, determine a gaze direction of the user's eye from the two-dimensional image, determine, based on the gaze direction and the depth data, a location on a display at which the gaze direction intersects the display, and output the location.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: David Nister, Ibrahim Eden
  • Publication number: 20140361996
    Abstract: Embodiments are disclosed that relate to calibrating an eye tracking system via touch inputs. For example, one disclosed embodiment provides, on a computing system comprising a touch sensitive display and an eye tracking system, a method comprising displaying a user interface on the touch sensitive display, determining a gaze location via the eye tracking system, receiving a touch input at a touch location on the touch sensitive display, and calibrating the eye tracking system based upon an offset between the gaze location and the touch location.
    Type: Application
    Filed: June 6, 2013
    Publication date: December 11, 2014
    Inventors: Ibrahim Eden, Ruchita Bhargava
  • Publication number: 20140172407
    Abstract: Automated language translation often involves language translation resources of significant size (e.g., 50-gigabyte phrase tables) and significant computational power exceeding the capabilities of many mobile devices. Remotely accessible servers capable of near-realtime, automated translation may be inaccessible or prohibitively costly while traveling abroad. Presented herein are adaptations of language translation techniques for offline mobile devices involving reducing the size and raising the efficiency of the language modeling resources. A word index may be provided that stores respective string representations of the words of a language, and maps respective words to a location (e.g., address or offset) of respective word representations within the word index. Language translation resources (e.g., phrase tables) may then specify logical relationships using the word index addresses of the involved words, rather than the string equivalents.
    Type: Application
    Filed: December 14, 2012
    Publication date: June 19, 2014
    Applicant: Microsoft Corporation
    Inventors: Ibrahim Eden, Christopher Quirk, Anthony Aue, Michel Galley, Frederik Schaffalitzky
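
The abstracts above describe several concrete mechanisms; the sketches below illustrate a few of them. They are non-authoritative readings of the abstracts, not the patented implementations, and every function, class, and parameter name in them is an assumption introduced for illustration. All sketches are in Python.

The first sketch relates to patent 9183197 and the corresponding publication 20140172407: store each word's string once in a packed word index, map each word to its offset, and let a phrase table refer to words by offset rather than by repeating the strings.

```python
# Minimal sketch (assumed names, not from the patent): a packed word index plus a
# phrase table whose entries reference words by offset instead of by string.

class WordIndex:
    """Packs each word's string into one buffer and maps the word to its offset."""

    def __init__(self, words):
        self.buffer = bytearray()
        self.offset_of = {}                      # word string -> offset into buffer
        for word in words:
            encoded = word.encode("utf-8")
            self.offset_of[word] = len(self.buffer)
            self.buffer += len(encoded).to_bytes(2, "little") + encoded

    def word_at(self, offset):
        """Recover the string representation stored at a given offset."""
        length = int.from_bytes(self.buffer[offset:offset + 2], "little")
        return self.buffer[offset + 2:offset + 2 + length].decode("utf-8")


# A phrase table entry then stores only offsets, not strings.
index = WordIndex(["the", "cat", "sat", "le", "chat"])
phrase_table = [
    # (source word offsets, target word offsets, score)
    ((index.offset_of["the"], index.offset_of["cat"]),
     (index.offset_of["le"], index.offset_of["chat"]), 0.9),
]

src, tgt, score = phrase_table[0]
print([index.word_at(o) for o in src], "->", [index.word_at(o) for o in tgt], score)
```

Referencing offsets rather than strings keeps phrase-table entries small and fixed-width and avoids duplicating word strings across the many phrases that share them, which is consistent with the abstract's goal of shrinking the translation resources for offline mobile use.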
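The next sketch relates to publication 20150310253: project light from different combinations of light sources, score each captured frame for occlusion (for example, glare caused by an optical structure such as eyeglasses between the eye and the camera), and keep the least-occluded combination. capture_frame and occlusion_score are placeholders standing in for the camera and the image analysis; they are not APIs named in the patent filing.

```python
# Hypothetical selection loop: try combinations of light sources and keep the one
# whose captured frame shows the least occlusion.

from itertools import combinations

def occlusion_score(frame):
    # Placeholder metric: fraction of near-saturated pixels (e.g., lens glare).
    return sum(pixel >= 250 for pixel in frame) / len(frame)

def capture_frame(active_sources):
    # Placeholder for "project light via these sources and capture an image".
    return [0] * 100

def select_light_sources(num_sources=4, subset_size=2):
    best_combo, best_score = None, float("inf")
    for combo in combinations(range(num_sources), subset_size):
        frame = capture_frame(combo)
        score = occlusion_score(frame)
        if score < best_score:
            best_combo, best_score = combo, score
    return best_combo

print(select_light_sources())
```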
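The following sketch relates to publication 20150310657: a guide element moves along a computed path toward the target object, and the guide continues to be displayed only while the viewer's gaze stays within a divergence threshold of it. The path, gaze signal, drawing callbacks, and threshold value are illustrative stand-ins.

```python
# Hypothetical control loop for gaze guidance with a divergence threshold.

import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def run_guide(path, gaze_location, draw_guide, hide_guide, divergence_threshold=50.0):
    """path: list of (x, y) guide positions ending at the target object."""
    for guide_pos in path:
        draw_guide(guide_pos)
        if distance(gaze_location(), guide_pos) > divergence_threshold:
            hide_guide()              # gaze diverged beyond the threshold: stop guiding
            return False
    return True                       # guide element reached the target object

# Toy usage: a gaze signal that lags slightly behind the guide element.
path = [(100, 100), (150, 120), (200, 140)]
gaze_samples = iter([(105, 102), (148, 118), (195, 138)])
reached = run_guide(path, gaze_location=lambda: next(gaze_samples),
                    draw_guide=lambda p: None, hide_guide=lambda: None)
print(reached)                        # True: gaze stayed within the divergence threshold
```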
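This sketch relates to publication 20150293587: a detected gaze change triggers a pending visual change along with non-visual feedback, and the change is displayed only if no cancel input arrives within a predetermined timeframe. The feedback, cancel, and apply callbacks are assumed placeholders, and the window length is an arbitrary choice.

```python
# Hypothetical cancel-window behavior for a gaze-triggered visual change.

import time

def handle_gaze_change(play_feedback, cancel_requested, apply_change, window_s=0.5):
    play_feedback()                       # non-visual cue that a change was triggered
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if cancel_requested():            # e.g., a button press or voice command
            return False                  # change suppressed
        time.sleep(0.01)
    apply_change()                        # no cancel received: display the change
    return True

applied = handle_gaze_change(play_feedback=lambda: print("beep"),
                             cancel_requested=lambda: False,
                             apply_change=lambda: print("visual change displayed"),
                             window_s=0.1)
print(applied)
```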
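This sketch relates to publication 20140375541: given an eye position recovered from depth data and a gaze direction recovered from the two-dimensional image, the gaze ray is intersected with the display plane to find the on-screen gaze location. The plane parameterization and all names are assumptions, not the filing's notation.

```python
# Hypothetical gaze-ray / display-plane intersection.

import numpy as np

def gaze_display_intersection(eye_pos, gaze_dir, plane_point, plane_normal):
    """Return the 3D point where the gaze ray meets the display plane, or None."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = np.dot(plane_normal, gaze_dir)
    if abs(denom) < 1e-6:                      # gaze parallel to the display
        return None
    t = np.dot(plane_normal, plane_point - eye_pos) / denom
    return eye_pos + t * gaze_dir if t > 0 else None

# Eye 60 cm in front of a display lying in the z = 0 plane.
eye = np.array([0.0, 0.0, 0.6])
direction = np.array([0.1, -0.05, -1.0])       # looking slightly right and down
print(gaze_display_intersection(eye, direction,
                                np.array([0.0, 0.0, 0.0]),
                                np.array([0.0, 0.0, 1.0])))
```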
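The last sketch relates to publication 20140361996: when the user touches the touch-sensitive display, the offset between the reported gaze location and the touch location is folded into a running correction applied to later gaze estimates. The exponential smoothing factor is an illustrative choice, not taken from the abstract.

```python
# Hypothetical touch-based calibration: accumulate gaze-vs-touch offsets into a
# smoothed correction term.

class TouchCalibrator:
    def __init__(self, smoothing=0.2):
        self.smoothing = smoothing
        self.offset = (0.0, 0.0)          # correction added to raw gaze estimates

    def on_touch(self, gaze_xy, touch_xy):
        dx, dy = touch_xy[0] - gaze_xy[0], touch_xy[1] - gaze_xy[1]
        self.offset = (self.offset[0] + self.smoothing * (dx - self.offset[0]),
                       self.offset[1] + self.smoothing * (dy - self.offset[1]))

    def correct(self, gaze_xy):
        return (gaze_xy[0] + self.offset[0], gaze_xy[1] + self.offset[1])

cal = TouchCalibrator()
cal.on_touch(gaze_xy=(190, 305), touch_xy=(200, 300))   # gaze reads 10 px left, 5 px low
print(cal.correct((400, 400)))                           # corrected estimate
```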