Patents by Inventor Eric N. Badger

Eric N. Badger has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11921966
    Abstract: Systems and methods related to intelligent typing and responses using eye-gaze technology are disclosed herein. In some example aspects, a dwell-free typing system is provided to a user typing with eye-gaze. A prediction processor may intelligently determine the desired word or action of the user. In some aspects, the prediction processor may contain elements of a natural language processor. In other aspects, the systems and methods may allow quicker response times from applications due to application of intelligent response algorithms. For example, a user may fixate on a certain button within a web-browser, and the prediction processor may present a response to the user by selecting the button in the web-browser, thereby initiating an action. In other example aspects, each gaze location may be associated with a UI element. The gaze data and associated UI elements may be processed for intelligent predictions and suggestions.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: March 5, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Harish S. Kulkarni
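The abstract above describes the core loop of dwell-free eye-gaze typing: gaze samples are snapped to on-screen UI elements (here, keyboard keys), and a prediction processor ranks candidate words against the observed key sequence. The following is a minimal illustrative sketch, not code from the patent; the key layout, word-frequency table, and function names are invented for illustration:

```python
# Hypothetical sketch of a gaze-driven word predictor: each gaze sample is
# snapped to the nearest key center, consecutive samples on the same key are
# collapsed into one fixation, and a small frequency model ranks candidate
# words that match the resulting key sequence.

KEY_CENTERS = {"h": (0, 0), "e": (1, 0), "l": (2, 0), "o": (3, 0), "p": (4, 0)}
WORD_FREQ = {"hello": 100, "help": 60, "hell": 10}  # toy language model

def nearest_key(point):
    x, y = point
    return min(KEY_CENTERS,
               key=lambda k: (KEY_CENTERS[k][0] - x) ** 2 + (KEY_CENTERS[k][1] - y) ** 2)

def predict(gaze_samples, top_n=2):
    keys = []
    for p in gaze_samples:
        k = nearest_key(p)
        if not keys or keys[-1] != k:   # collapse consecutive samples on one key
            keys.append(k)
    prefix = "".join(keys)
    candidates = [w for w in WORD_FREQ if w.startswith(prefix)]
    return sorted(candidates, key=WORD_FREQ.get, reverse=True)[:top_n]
```

A real system would replace the toy frequency table with a natural-language model, as the abstract suggests, but the snap-then-rank structure is the same.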
  • Patent number: 11915671
    Abstract: Techniques for providing adaptive assistive technology for assisting users with visual impairment can be used on a computing device. These techniques include displaying content to a user, capturing a series of images or video of the user using a camera, analyzing the series of images or video to determine whether the user is exhibiting behavior or characteristics indicative of visual impairment, and rendering a magnification user interface on the display configured to magnify at least a portion of the content of the display based on a determination that the user is exhibiting behavior or characteristics indicative of visual impairment. The magnification user interface may be controlled based on head and/or eye movements of the user of the computing device.
    Type: Grant
    Filed: July 11, 2022
    Date of Patent: February 27, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason A. Grieves, Eric N. Badger, Grant M. Wynn, Paul J. Olczak, Christian Klein
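The magnification interface described above must continuously decide which region of the screen to enlarge as the user's gaze or head position moves. A minimal sketch of that geometry, with all names and the clamping behavior assumed for illustration rather than taken from the patent:

```python
def magnifier_viewport(gaze_x, gaze_y, screen_w, screen_h, zoom=2.0):
    """Return the (left, top, width, height) source region that, when scaled
    by `zoom`, fills the screen centered on the gaze point. The region is
    clamped so it never extends past the screen edges."""
    src_w, src_h = screen_w / zoom, screen_h / zoom
    left = min(max(gaze_x - src_w / 2, 0), screen_w - src_w)
    top = min(max(gaze_y - src_h / 2, 0), screen_h - src_h)
    return left, top, src_w, src_h
```

Feeding successive gaze (or head-pose) estimates through a function like this yields a magnifier window that follows the user, which is the control behavior the abstract describes.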
  • Patent number: 11880545
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Grant
    Filed: June 24, 2021
    Date of Patent: January 23, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
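Two ideas from the abstract above are easy to express concretely: per-key-class dwell thresholds (longer for destructive or critical keys, shorter for character keys) and adaptation of those thresholds from observed user timing. A hypothetical sketch, with the threshold values and the moving-average update chosen for illustration:

```python
# Baseline dwell thresholds in milliseconds; critical actions require a
# longer gaze fixation than ordinary character keys.
BASE_DWELL_MS = {"char": 400, "space": 350, "suggestion": 700, "send": 900, "close": 1000}

class DwellTimer:
    def __init__(self, alpha=0.2):
        self.thresholds = dict(BASE_DWELL_MS)
        self.alpha = alpha  # adaptation rate

    def adapt(self, key_class, observed_ms):
        # Nudge the threshold toward the user's observed reaction time
        # with an exponential moving average.
        t = self.thresholds[key_class]
        self.thresholds[key_class] = (1 - self.alpha) * t + self.alpha * observed_ms

    def is_activated(self, key_class, dwell_ms):
        return dwell_ms >= self.thresholds[key_class]
```

With this split, a 450 ms fixation selects a character key but is ignored on a "send" key, which is exactly the unintended-selection guard the abstract describes.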
  • Publication number: 20220366874
    Abstract: Techniques for providing adaptive assistive technology for assisting users with visual impairment can be used on a computing device. These techniques include displaying content to a user, capturing a series of images or video of the user using a camera, analyzing the series of images or video to determine whether the user is exhibiting behavior or characteristics indicative of visual impairment, and rendering a magnification user interface on the display configured to magnify at least a portion of the content of the display based on a determination that the user is exhibiting behavior or characteristics indicative of visual impairment. The magnification user interface may be controlled based on head and/or eye movements of the user of the computing device.
    Type: Application
    Filed: July 11, 2022
    Publication date: November 17, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason A. Grieves, Eric N. Badger, Grant M. Wynn, Paul J. Olczak, Christian Klein
  • Patent number: 11430414
    Abstract: Techniques for providing adaptive assistive technology for assisting users with visual impairment can be used on a computing device. These techniques include displaying content to a user, capturing a series of images or video of the user using a camera, analyzing the series of images or video to determine whether the user is exhibiting behavior or characteristics indicative of visual impairment, and rendering a magnification user interface on the display configured to magnify at least a portion of the content of the display based on a determination that the user is exhibiting behavior or characteristics indicative of visual impairment. The magnification user interface may be controlled based on head and/or eye movements of the user of the computing device.
    Type: Grant
    Filed: October 17, 2019
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason A. Grieves, Eric N. Badger, Grant M. Wynn, Paul J. Olczak, Christian Klein
  • Publication number: 20220155912
    Abstract: Systems and methods related to intelligent typing and responses using eye-gaze technology are disclosed herein. In some example aspects, a dwell-free typing system is provided to a user typing with eye-gaze. A prediction processor may intelligently determine the desired word or action of the user. In some aspects, the prediction processor may contain elements of a natural language processor. In other aspects, the systems and methods may allow quicker response times from applications due to application of intelligent response algorithms. For example, a user may fixate on a certain button within a web-browser, and the prediction processor may present a response to the user by selecting the button in the web-browser, thereby initiating an action. In other example aspects, each gaze location may be associated with a UI element. The gaze data and associated UI elements may be processed for intelligent predictions and suggestions.
    Type: Application
    Filed: January 31, 2022
    Publication date: May 19, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Harish S. Kulkarni
  • Publication number: 20220155911
    Abstract: Systems and methods related to intelligent typing and responses using eye-gaze technology are disclosed herein. In some example aspects, a dwell-free typing system is provided to a user typing with eye-gaze. A prediction processor may intelligently determine the desired word or action of the user. In some aspects, the prediction processor may contain elements of a natural language processor. In other aspects, the systems and methods may allow quicker response times from applications due to application of intelligent response algorithms. For example, a user may fixate on a certain button within a web-browser, and the prediction processor may present a response to the user by selecting the button in the web-browser, thereby initiating an action. In other example aspects, each gaze location may be associated with a UI element. The gaze data and associated UI elements may be processed for intelligent predictions and suggestions.
    Type: Application
    Filed: January 31, 2022
    Publication date: May 19, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Harish S. Kulkarni
  • Patent number: 11237691
    Abstract: Systems and methods related to intelligent typing and responses using eye-gaze technology are disclosed herein. In some example aspects, a dwell-free typing system is provided to a user typing with eye-gaze. A prediction processor may intelligently determine the desired word or action of the user. In some aspects, the prediction processor may contain elements of a natural language processor. In other aspects, the systems and methods may allow quicker response times from applications due to application of intelligent response algorithms. For example, a user may fixate on a certain button within a web-browser, and the prediction processor may present a response to the user by selecting the button in the web-browser, thereby initiating an action. In other example aspects, each gaze location may be associated with a UI element. The gaze data and associated UI elements may be processed for intelligent predictions and suggestions.
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: February 1, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Harish S. Kulkarni
  • Publication number: 20210318794
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Application
    Filed: June 24, 2021
    Publication date: October 14, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
  • Patent number: 11079899
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: August 3, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
  • Publication number: 20210117048
    Abstract: Adaptive assistance technologies can assist a user with operating a computing device. A method according to these techniques includes analyzing user interactions with the computing device, determining that the user interactions with the computing device are indicative of a user experiencing one or more issues for which adaptive assistive technologies provided by the computing device may assist the user, identifying one or more assistive technologies provided by the computing device that may address the one or more issues, modifying one or more operating parameters of the computing device using the one or more assistive technologies, and operating the computing device according to the one or more modified operating parameters.
    Type: Application
    Filed: October 17, 2019
    Publication date: April 22, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason A. Grieves, Eric N. Badger, Grant M. Wynn, Paul J. Olczak, Christian Klein
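The abstract above describes a pipeline that maps observed interaction difficulties to assistive features. A toy sketch of that mapping step; every signal name, threshold, and recommended feature here is an invented placeholder, not taken from the patent:

```python
def recommend_assists(signals):
    """Map hypothetical interaction metrics to assistive technologies.

    `signals` is a dict of observed rates/counts, e.g. how often the user
    squints at the screen or misses click targets.
    """
    recs = []
    if signals.get("squint_rate", 0) > 0.3 or signals.get("lean_in_events", 0) > 2:
        recs.append("magnifier")            # signs of visual difficulty
    if signals.get("missed_click_rate", 0) > 0.2:
        recs.append("larger_pointer_targets")  # signs of pointing difficulty
    if signals.get("repeated_undo_count", 0) > 3:
        recs.append("key_filtering")        # signs of accidental input
    return recs
```

The claimed method then modifies the device's operating parameters using the recommended technologies and runs with those parameters in effect.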
  • Publication number: 20210118410
    Abstract: Techniques for providing adaptive assistive technology for assisting users with visual impairment can be used on a computing device. These techniques include displaying content to a user, capturing a series of images or video of the user using a camera, analyzing the series of images or video to determine whether the user is exhibiting behavior or characteristics indicative of visual impairment, and rendering a magnification user interface on the display configured to magnify at least a portion of the content of the display based on a determination that the user is exhibiting behavior or characteristics indicative of visual impairment. The magnification user interface may be controlled based on head and/or eye movements of the user of the computing device.
    Type: Application
    Filed: October 17, 2019
    Publication date: April 22, 2021
    Inventors: Jason A. Grieves, Eric N. Badger, Grant M. Wynn, Paul J. Olczak, Christian Klein
  • Patent number: 10496162
    Abstract: The systems and methods described herein assist persons with the use of computers based on eye gaze, and allow such persons to control such computing systems using various eye trackers. The systems and methods described herein use eye trackers to control cursor (or some other indicator) positioning on an operating system using the gaze location reported by the eye tracker. The systems and methods described herein utilize an interaction model that allows control of a computer using eye gaze and dwell. The data from eye trackers provides a gaze location on the screen. The systems and methods described herein control a graphical user interface that is part of an operating system relative to cursor positioning and associated actions such as Left-Click, Right-Click, Double-Click, and the like. The interaction model presents appropriate user interfaces to navigate the user through applications on the computing system.
    Type: Grant
    Filed: July 26, 2017
    Date of Patent: December 3, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Harish Sripad Kulkarni, Dwayne Lamb, Ann M. Paradiso, Eric N. Badger, Jonathan Thomas Campbell, Peter John Ansell, Jacob Daniel Cohen
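The interaction model described above, gaze positions the cursor and a dwell commits an action such as Left-Click or Right-Click, can be sketched as a small two-step state machine. This is an illustrative reconstruction under assumed names, not the patent's implementation:

```python
class GazeMouse:
    """Minimal sketch: a first dwell saves the gazed screen point and asks the
    UI to show an action menu (Left-Click, Right-Click, Double-Click, ...);
    a second dwell, on a menu item, performs that action at the saved point."""

    def __init__(self, dwell_ms=600):
        self.dwell_ms = dwell_ms     # fixation time required to register a dwell
        self.pending_point = None    # screen point captured by the first dwell

    def on_dwell(self, target, point, perform):
        if self.pending_point is None:
            self.pending_point = point   # first dwell: remember where to act
            return "show_menu"
        perform(target, self.pending_point)  # second dwell: menu item chosen
        self.pending_point = None
        return "performed"
```

Splitting "where" from "what" in this way is what lets a single input channel (gaze plus dwell) drive the full set of mouse actions.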
  • Publication number: 20190033964
    Abstract: The systems and methods described herein assist persons with the use of computers based on eye gaze, and allow such persons to control such computing systems using various eye trackers. The systems and methods described herein use eye trackers to control cursor (or some other indicator) positioning on an operating system using the gaze location reported by the eye tracker. The systems and methods described herein utilize an interaction model that allows control of a computer using eye gaze and dwell. The data from eye trackers provides a gaze location on the screen. The systems and methods described herein control a graphical user interface that is part of an operating system relative to cursor positioning and associated actions such as Left-Click, Right-Click, Double-Click, and the like. The interaction model presents appropriate user interfaces to navigate the user through applications on the computing system.
    Type: Application
    Filed: July 26, 2017
    Publication date: January 31, 2019
    Inventors: Harish Sripad Kulkarni, Dwayne Lamb, Ann M. Paradiso, Eric N. Badger, Jonathan Thomas Campbell, Peter John Ansell, Jacob Daniel Cohen
  • Publication number: 20190034057
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Application
    Filed: December 13, 2017
    Publication date: January 31, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
  • Publication number: 20190034038
    Abstract: Systems and methods related to intelligent typing and responses using eye-gaze technology are disclosed herein. In some example aspects, a dwell-free typing system is provided to a user typing with eye-gaze. A prediction processor may intelligently determine the desired word or action of the user. In some aspects, the prediction processor may contain elements of a natural language processor. In other aspects, the systems and methods may allow quicker response times from applications due to application of intelligent response algorithms. For example, a user may fixate on a certain button within a web-browser, and the prediction processor may present a response to the user by selecting the button in the web-browser, thereby initiating an action. In other example aspects, each gaze location may be associated with a UI element. The gaze data and associated UI elements may be processed for intelligent predictions and suggestions.
    Type: Application
    Filed: December 13, 2017
    Publication date: January 31, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Harish S. Kulkarni
  • Patent number: 8285549
    Abstract: A personality-based theme may be provided. A prompt and an input indicating a personality may be received. Next, a voice font corresponding to the personality may be determined. The voice font may then be applied to the received prompt, and the voice-font-rendered prompt may be augmented with recorded phrases of the personality.
    Type: Grant
    Filed: February 24, 2012
    Date of Patent: October 9, 2012
    Assignee: Microsoft Corporation
    Inventors: Hugh A. Teegan, Eric N. Badger, Drew E. Linerud
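The three voice-font patents above share one pipeline: look up a personality-specific prompt, synthesize it with that personality's voice font, and augment the result with recorded phrases. A hypothetical sketch of that lookup-and-combine flow; the personalities, resource tables, and the string stand-in for synthesized audio are all invented for illustration:

```python
# Toy stand-ins for the personality resource file, recorded-phrase store,
# and voice font database that the abstracts describe.
PERSONALITY_PROMPTS = {"pirate": {"greeting": "You have new mail"}}
RECORDED_PHRASES = {"pirate": {"greeting_suffix": "arr_matey.wav"}}
VOICE_FONTS = {"pirate": "gravelly_v1"}

def render_prompt(personality, prompt_id):
    """Return (synthesized prompt, optional recorded phrase to append)."""
    text = PERSONALITY_PROMPTS[personality][prompt_id]
    font = VOICE_FONTS[personality]
    synthesized = f"[{font}] {text}"   # stand-in for voice-font TTS output
    recorded = RECORDED_PHRASES[personality].get(prompt_id + "_suffix")
    return synthesized, recorded
```

In the granted claims the synthesis step is a real text-to-speech engine and the output goes to an audio device; the lookup structure is what this sketch is meant to show.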
  • Publication number: 20120150543
    Abstract: A personality-based theme may be provided. An application program may query a personality resource file for a prompt corresponding to a personality. Then the prompt may be received at a speech synthesis engine. Next, the speech synthesis engine may query a personality voice font database for a voice font corresponding to the personality. Then the speech synthesis engine may apply the voice font to the prompt. The voice font applied prompt may then be produced at an output device.
    Type: Application
    Filed: February 24, 2012
    Publication date: June 14, 2012
    Applicant: Microsoft Corporation
    Inventors: Hugh A. Teegan, Eric N. Badger, Drew E. Linerud
  • Patent number: 8131549
    Abstract: A personality-based theme may be provided to a device. An application program may query a personality resource file for a prompt corresponding to a personality. Then the prompt may be received at a text to speech synthesis engine. Next, the speech synthesis engine may query a personality voice font and recorded phrases database for a voice font corresponding to the personality and may alter the prompt text to conform with the grammatical style of the personality. Then the speech synthesis engine may apply the voice font to the prompt, which is then produced at an output device.
    Type: Grant
    Filed: May 24, 2007
    Date of Patent: March 6, 2012
    Assignee: Microsoft Corporation
    Inventors: Hugh A. Teegan, Eric N. Badger, Drew E. Linerud
  • Patent number: 8019606
    Abstract: An audible indication of a user's position within a given speech grammar framework is provided for a speech-enabled software application, and recognition of speech grammars is limited to use only when a software application that has requested a given set of speech grammars is in focus by a user of an associated mobile computing device.
    Type: Grant
    Filed: June 29, 2007
    Date of Patent: September 13, 2011
    Assignee: Microsoft Corporation
    Inventors: Eric N. Badger, Cameron Ali Etezadi
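The focus-gating behavior in the last abstract, where only the grammar of the in-focus application is eligible for recognition, can be sketched as follows. The class and method names are assumptions for illustration, not the patent's API:

```python
class GrammarManager:
    """Sketch: each application registers its speech grammar; recognition is
    gated so that only the currently focused application's grammar is active."""

    def __init__(self):
        self.grammars = {}   # app name -> list of recognizable phrases
        self.focused = None  # app currently in focus

    def register(self, app, phrases):
        self.grammars[app] = phrases

    def set_focus(self, app):
        self.focused = app

    def recognize(self, phrase):
        # A phrase is recognized only if the focused app's grammar contains it.
        active = self.grammars.get(self.focused, [])
        return phrase if phrase in active else None
```

The audible position indication in the claims would sit alongside this: as the user moves through the active grammar, the system plays a cue reflecting where in the grammar they currently are.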