Patents by Inventor Jonathan T. Campbell

Jonathan T. Campbell has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11907419
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: February 20, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan Raghunath, Austin B. Hodges, Fei Su, Akhilesh Kaza, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
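The selection mechanism in the abstract above (a zone around each element's boundary, per-gaze-point values, and a per-element threshold on the aggregate) can be sketched as follows. This is a minimal illustration, not the patented implementation; the zone margin, the 1.0/0.5 scoring, and the threshold values are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class UIElement:
    """A UI element with a rectangular boundary and a surrounding selection zone."""
    name: str
    x: float
    y: float
    w: float
    h: float
    zone_margin: float = 20.0   # selection zone extends this far past the boundary
    threshold: float = 3.0      # aggregate gaze value needed to select

    def gaze_value(self, gx: float, gy: float) -> float:
        """Score one gaze location: full weight inside the element,
        partial weight inside the surrounding selection zone, zero outside."""
        if self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h:
            return 1.0
        zx0, zy0 = self.x - self.zone_margin, self.y - self.zone_margin
        zx1 = self.x + self.w + self.zone_margin
        zy1 = self.y + self.h + self.zone_margin
        if zx0 <= gx <= zx1 and zy0 <= gy <= zy1:
            return 0.5
        return 0.0

def is_selected(element: UIElement, gaze_points: list[tuple[float, float]]) -> bool:
    """Select the element once the aggregated gaze value meets its threshold."""
    total = sum(element.gaze_value(gx, gy) for gx, gy in gaze_points)
    return total >= element.threshold
```

For example, three fixations landing inside a button with `threshold=3.0` would select it, while fixations that only graze the zone contribute partial credit and require more samples.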
  • Patent number: 11880545
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Grant
    Filed: June 24, 2021
    Date of Patent: January 23, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
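The dynamic dwell-time idea above (longer dwell for critical keys, shorter for routine ones, adapted from the user's observed action times) can be illustrated with a small sketch. The specific millisecond values, the 50/50 blending rule, and the clamp range are assumptions for the example, not values from the patent.

```python
# Per-key dwell times: longer for critical keys (e.g. "send", "close",
# word suggestions), shorter for routine keys (character keys, spacebar),
# adapted toward the user's own observed reaction times.

CRITICAL_DWELL_MS = 800.0
ROUTINE_DWELL_MS = 250.0

class DwellTimer:
    def __init__(self, critical: bool):
        self.dwell_ms = CRITICAL_DWELL_MS if critical else ROUTINE_DWELL_MS
        self._samples: list[float] = []

    def record_reaction(self, reaction_ms: float) -> None:
        """Feed an observed user action time (e.g. how long the user takes to
        read a word suggestion before selecting it) and nudge the dwell time
        toward the user's running average, clamped to a sane range."""
        self._samples.append(reaction_ms)
        avg = sum(self._samples) / len(self._samples)
        self.dwell_ms = max(150.0, min(1500.0, 0.5 * self.dwell_ms + 0.5 * avg))

    def triggers(self, fixation_ms: float) -> bool:
        """True if a continuous fixation of this length selects the key."""
        return fixation_ms >= self.dwell_ms
```

A brief fixation then triggers a character key but not a "send" key, which reduces accidental activation of destructive or hard-to-undo actions.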
  • Publication number: 20230195224
    Abstract: Systems and methods are provided for predicting an eye gaze location of an operator of a computing device. In particular, the method generates an image grid that includes regions of interest based on a facial image. The facial image is based on a received image frame of a video stream that captures the operator using the computing device. The image grid further includes a region that indicates rotation information of the face. The method further uses a combination of trained neural networks to extract features of the regions of interest in the image grid and predict the eye gaze location on the screen of the computing device. The trained set of neural networks includes a convolutional neural network. The method optionally generates head pose pitch, roll, and yaw information to improve the accuracy of predicting the location of an eye gaze.
    Type: Application
    Filed: February 23, 2023
    Publication date: June 22, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jatin SHARMA, Jonathan T. CAMPBELL, Jay C. BEAVERS, Peter John ANSELL
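The "image grid" described in the abstract above tiles several regions of interest into a single input for the neural network. A minimal structural sketch of that assembly step (using constant-filled patches as stand-ins for real crops, and an assumed 2x2 face / left-eye / right-eye / rotation layout) might look like this:

```python
# Assemble the model input: a 2x2 "image grid" holding the face crop,
# the left-eye crop, the right-eye crop, and a synthetic region encoding
# face rotation, ready to hand to a convolutional network.

def make_region(value: float, size: int) -> list[list[float]]:
    """A size x size single-channel patch filled with one value
    (a stand-in for a real image crop or an encoded rotation map)."""
    return [[value] * size for _ in range(size)]

def build_image_grid(face, left_eye, right_eye, rotation):
    """Tile four equally sized regions into one (2*size) x (2*size) grid:
        [ face      | left eye  ]
        [ right eye | rotation  ]
    """
    size = len(face)
    grid = []
    for row in range(size):
        grid.append(face[row] + left_eye[row])
    for row in range(size):
        grid.append(right_eye[row] + rotation[row])
    return grid
```

The convolutional network then sees all regions, including the rotation-encoding region, in one tensor, so its filters can learn cross-region features without a separate head-pose input path.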
  • Patent number: 11619993
    Abstract: Systems and methods are provided for predicting an eye gaze location of an operator of a computing device. In particular, the method generates an image grid that includes regions of interest based on a facial image. The facial image is based on a received image frame of a video stream that captures the operator using the computing device. The image grid further includes a region that indicates rotation information of the face. The method further uses a combination of trained neural networks to extract features of the regions of interest in the image grid and predict the eye gaze location on the screen of the computing device. The trained set of neural networks includes a convolutional neural network. The method optionally generates head pose pitch, roll, and yaw information to improve the accuracy of predicting the location of an eye gaze.
    Type: Grant
    Filed: April 19, 2021
    Date of Patent: April 4, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jatin Sharma, Jonathan T. Campbell, Jay C. Beavers, Peter John Ansell
  • Publication number: 20220334637
    Abstract: Systems and methods are provided for predicting an eye gaze location of an operator of a computing device. In particular, the method generates an image grid that includes regions of interest based on a facial image. The facial image is based on a received image frame of a video stream that captures the operator using the computing device. The image grid further includes a region that indicates rotation information of the face. The method further uses a combination of trained neural networks to extract features of the regions of interest in the image grid and predict the eye gaze location on the screen of the computing device. The trained set of neural networks includes a convolutional neural network. The method optionally generates head pose pitch, roll, and yaw information to improve the accuracy of predicting the location of an eye gaze.
    Type: Application
    Filed: April 19, 2021
    Publication date: October 20, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jatin SHARMA, Jonathan T. CAMPBELL, Jay C. BEAVERS, Peter John ANSELL
  • Publication number: 20220330863
    Abstract: Systems and methods are provided for collecting eye-gaze data for training an eye-gaze prediction model. The collecting includes selecting a scan path that passes through a series of regions of a grid on a screen of a computing device, moving a symbol as an eye-gaze target along the scan path, and receiving facial images at eye-gaze points. The eye-gaze points are uniformly distributed within their respective regions. Regions adjacent to the edges and corners of the screen are smaller in area than the other regions. This difference in areas shifts the centers of those regions toward the edges, increasing the density of collected data near the edges. The scan path passes through locations in proximity to the edges and corners of the screen to capture more eye-gaze points there. The methods interactively enhance variations of the facial images by displaying instructions for the user to perform specific facial actions.
    Type: Application
    Filed: April 19, 2021
    Publication date: October 20, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jatin SHARMA, Jonathan T. CAMPBELL, Jay C. BEAVERS, Peter John ANSELL
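The collection scheme above (a screen grid whose edge and corner regions are smaller, one uniformly sampled target per region, and a scan path that sweeps every region) can be sketched as below. The shrink factor for edge regions and the serpentine visiting order are assumptions made for the illustration.

```python
import random

def region_edges(length: float, n: int, edge_frac: float = 0.6) -> list[float]:
    """Split one screen axis into n regions where the first and last
    (edge) regions are edge_frac times the size of interior regions,
    shifting region centers toward the screen edges."""
    inner = length / (n - 2 + 2 * edge_frac)
    widths = [inner * edge_frac] + [inner] * (n - 2) + [inner * edge_frac]
    edges, pos = [0.0], 0.0
    for w in widths:
        pos += w
        edges.append(pos)
    return edges

def scan_path_targets(width: float, height: float, nx: int, ny: int, rng=random):
    """One uniformly sampled eye-gaze target per region, visiting regions
    in a serpentine order so the moving symbol sweeps the whole screen,
    including the shrunken edge and corner regions."""
    xs, ys = region_edges(width, nx), region_edges(height, ny)
    targets = []
    for j in range(ny):
        cols = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)
        for i in cols:
            x = rng.uniform(xs[i], xs[i + 1])
            y = rng.uniform(ys[j], ys[j + 1])
            targets.append((x, y))
    return targets
```

Because the edge regions are narrower, their sampled targets cluster nearer the screen border, which is where webcam-based gaze predictors typically have the least training data.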
  • Publication number: 20210325962
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Application
    Filed: June 29, 2021
    Publication date: October 21, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan RAGHUNATH, Austin B. HODGES, Fei SU, Akhilesh KAZA, Peter John ANSELL, Jonathan T. CAMPBELL, Harish S. KULKARNI
  • Publication number: 20210318794
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Application
    Filed: June 24, 2021
    Publication date: October 14, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro RUDCHENKO, Eric N. BADGER, Akhilesh KAZA, Jacob Daniel COHEN, Peter John ANSELL, Jonathan T. CAMPBELL, Harish S. KULKARNI
  • Patent number: 11079899
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: August 3, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dmytro Rudchenko, Eric N. Badger, Akhilesh Kaza, Jacob Daniel Cohen, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
  • Patent number: 11073904
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: July 27, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan Raghunath, Austin B. Hodges, Fei Su, Akhilesh Kaza, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
  • Patent number: 10866633
    Abstract: A user's signature may be decomposed into one or more components. The components may be described using one or more control points. A user may sign with their eyes by focusing their gaze on a set of these control points that make up the signature. If the user's gaze is within a threshold distance from a control point, the signature is validated. Moreover, by modifying the control points based upon the actual gaze location (which is within the threshold distance), the signature may be slightly modified. For example, the signature may be decomposed into one or more components as Bezier curves and the user may be asked to focus on each control point of each of the one or more Bezier curves. Modifying the control points of a Bezier curve slightly still produces a smooth curve, but introduces natural variations.
    Type: Grant
    Filed: February 28, 2017
    Date of Patent: December 15, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jonathan T. Campbell, Harish S. Kulkarni
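The validate-then-vary idea in the abstract above can be sketched briefly: evaluate the Bezier component from its control points, check each gaze fixation against its control point, and, on success, nudge the control points toward the actual gaze locations. The threshold value and the midpoint blending rule are assumptions for the example.

```python
import math

def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at t in [0, 1] via de Casteljau's algorithm;
    ctrl is a list of (x, y) control points."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def validate_and_vary(ctrl, gazes, threshold=25.0):
    """Check each gaze fixation against its control point. If every fixation
    lies within the threshold distance, the signature validates and the
    control points are blended toward the actual gaze locations, so the
    reproduced curve stays smooth but gains natural variation."""
    if len(gazes) != len(ctrl):
        return False, ctrl
    varied = []
    for (cx, cy), (gx, gy) in zip(ctrl, gazes):
        if math.hypot(gx - cx, gy - cy) > threshold:
            return False, ctrl
        varied.append(((cx + gx) / 2, (cy + gy) / 2))
    return True, varied
```

Since the varied control points stay within the threshold of the originals, re-rendering the curve from them still yields a smooth, recognizable stroke with slight per-signing differences.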
  • Publication number: 20190033965
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Application
    Filed: December 13, 2017
    Publication date: January 31, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan RAGHUNATH, Austin B. HODGES, Fei SU, Akhilesh KAZA, Peter John ANSELL, Jonathan T. CAMPBELL, Harish S. KULKARNI
  • Publication number: 20190034057
    Abstract: Systems and methods disclosed herein relate to assigning dynamic eye-gaze dwell-times. Dynamic dwell-times may be tailored to the individual user. For example, a dynamic dwell-time system may be configured to receive data from the user, such as the duration of time the user takes to execute certain actions within applications (e.g., read a word suggestion before actually selecting it). The dynamic dwell-time system may also prevent users from making unintended selections by providing different dwell times for different buttons. Specifically, on a user interface, longer dwell times may be established for the critical keys (e.g., “close” program key, “send” key, word suggestions, and the like) and shorter dwell times may be established for the less critical keys (e.g., individual character keys on a virtual keyboard, spacebar, backspace, and the like).
    Type: Application
    Filed: December 13, 2017
    Publication date: January 31, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Dmytro RUDCHENKO, Eric N. BADGER, Akhilesh KAZA, Jacob Daniel COHEN, Peter John ANSELL, Jonathan T. CAMPBELL, Harish S. KULKARNI
  • Publication number: 20180246567
    Abstract: A user's signature may be decomposed into one or more components. The components may be described using one or more control points. A user may sign with their eyes by focusing their gaze on a set of these control points that make up the signature. If the user's gaze is within a threshold distance from a control point, the signature is validated. Moreover, by modifying the control points based upon the actual gaze location (which is within the threshold distance), the signature may be slightly modified. For example, the signature may be decomposed into one or more components as Bezier curves and the user may be asked to focus on each control point of each of the one or more Bezier curves. Modifying the control points of a Bezier curve slightly still produces a smooth curve, but introduces natural variations.
    Type: Application
    Filed: February 28, 2017
    Publication date: August 30, 2018
    Inventors: Jonathan T. Campbell, Harish S. Kulkarni
  • Patent number: 8442937
    Abstract: A first database and a second database have different schemas. An activity in a workflow accesses a data item in a list by invoking a method in an interface of a list object. The list comprises a collection of data items. Each data item in the list comprises an item key field specifying an item key. No two data items in the data item collection have item key fields specifying a shared item key. In addition, each data item in the list comprises a set of additional fields. Each field in the set of additional fields has a value derived from the first database or each field in the set of additional fields has a value derived from the second database. In this way, the activity can be implemented without knowledge of the different schemas of the first database and the second database.
    Type: Grant
    Filed: March 31, 2009
    Date of Patent: May 14, 2013
    Assignee: Microsoft Corporation
    Inventors: Sean K. Gabriel, Alexander Malek, Mark E. Phair, Eray Chou, Jonathan T. Campbell, Bradley C. Stevenson, Constantin Stanciu
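The list abstraction in the abstract above (items with unique keys whose additional fields may come from either of two differently-schemed databases, accessed through one interface) can be sketched as follows. The column names and field names here are invented for illustration; the patent does not specify them.

```python
# A list object exposing data items keyed by a unique item key, hiding
# whether each item's additional fields came from database A or database B.

class ListItem:
    def __init__(self, key, fields):
        self.key = key          # item key, unique within the list
        self.fields = fields    # additional fields, already normalized

class WorkflowList:
    """The interface a workflow activity invokes; the activity never
    sees either database's schema, only ListItem objects."""
    def __init__(self):
        self._items = {}

    def add(self, item: ListItem) -> None:
        # Enforce the rule that no two items share an item key.
        if item.key in self._items:
            raise ValueError(f"duplicate item key: {item.key!r}")
        self._items[item.key] = item

    def get_item(self, key) -> ListItem:
        return self._items[key]

def from_db_a(row: dict) -> ListItem:
    """Map a row from the first database's (hypothetical) schema."""
    return ListItem(row["id"], {"title": row["name"], "owner": row["created_by"]})

def from_db_b(row: dict) -> ListItem:
    """Map a row from the second database's (hypothetical) schema
    into the same normalized shape."""
    return ListItem(row["item_key"], {"title": row["display_name"], "owner": row["author"]})
```

The workflow activity only ever calls `get_item` and reads normalized fields, so neither schema leaks into the activity's implementation.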
  • Publication number: 20100250487
    Abstract: A first database and a second database have different schemas. An activity in a workflow accesses a data item in a list by invoking a method in an interface of a list object. The list comprises a collection of data items. Each data item in the list comprises an item key field specifying an item key. No two data items in the data item collection have item key fields specifying a shared item key. In addition, each data item in the list comprises a set of additional fields. Each field in the set of additional fields has a value derived from the first database or each field in the set of additional fields has a value derived from the second database. In this way, the activity can be implemented without knowledge of the different schemas of the first database and the second database.
    Type: Application
    Filed: March 31, 2009
    Publication date: September 30, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Sean K. Gabriel, Alexander Malek, Mark E. Phair, Eray Chou, Jonathan T. Campbell, Bradley C. Stevenson, Constantin Stanciu