Patents by Inventor Dominic Philip Haine

Dominic Philip Haine has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11907417
    Abstract: Described herein are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an eye-mounted display (EMD), assist the wearer of a contact lens carrying the EMD to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments of the invention provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, this is accomplished by revealing layers of virtual objects and content based on eye-tracking and other motion information.
    Type: Grant
    Filed: July 25, 2019
    Date of Patent: February 20, 2024
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
  • Patent number: 11874961
    Abstract: An augmented reality device manages display of an interactive icon in a manner that enables selection by a simple and intuitive gesture. The interactive icon may initially be displayed at a predefined target position outside the fovea where it is visible in the near peripheral vision without being distracting to the user. The augmented reality device may control the icon to behave like a stationary object with respect to changes in orientation until selection or repositioning criteria are met. Upon detection of repositioning criteria, the augmented reality device may reposition the icon to the target position. Selection of the icon may control functions such as alerting the user to an available notification, opening or closing a menu, or performing other actions associated with operation of the augmented reality device.
    Type: Grant
    Filed: May 9, 2022
    Date of Patent: January 16, 2024
    Assignee: Tectus Corporation
    Inventors: Ben Rafael Kimel Green, Dominic Philip Haine
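    The icon behavior described in the abstract above (world-fixed until a repositioning criterion is met, then restored to the target position) can be sketched in a few lines. This is an illustrative model only; the target position, drift limit, and coordinate convention are hypothetical values, not the patented implementation.

    ```python
    # Minimal sketch of a world-fixed icon that snaps back to its target
    # position once it drifts past a repositioning limit. Angles in degrees;
    # all constants are illustrative assumptions.
    TARGET = (12.0, 0.0)   # target position outside the fovea
    MAX_DRIFT = 25.0       # repositioning criterion: eccentricity limit

    def update_icon(icon_pos, orientation_delta):
        """Shift the world-fixed icon opposite the orientation change;
        snap back to the target position past the repositioning limit."""
        x = icon_pos[0] - orientation_delta[0]
        y = icon_pos[1] - orientation_delta[1]
        if (x ** 2 + y ** 2) ** 0.5 > MAX_DRIFT:
            return TARGET  # repositioning criteria met
        return (x, y)
    ```

    Under this model, small head or eye turns leave the icon stationary in the world, while large excursions trigger the snap-back described in the abstract.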
  • Publication number: 20230359272
    Abstract: An augmented reality device manages display of an interactive icon in a manner that enables selection by a simple and intuitive gesture. The interactive icon may initially be displayed at a predefined target position outside the fovea where it is visible in the near peripheral vision without being distracting to the user. The augmented reality device may control the icon to behave like a stationary object with respect to changes in orientation until selection or repositioning criteria are met. Upon detection of repositioning criteria, the augmented reality device may reposition the icon to the target position. Selection of the icon may control functions such as alerting the user to an available notification, opening or closing a menu, or performing other actions associated with operation of the augmented reality device.
    Type: Application
    Filed: May 9, 2022
    Publication date: November 9, 2023
    Inventors: Ben Rafael Kimel Green, Dominic Philip Haine
  • Patent number: 11662807
    Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality (“AR”) and/or virtual reality (“VR”) environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual “peek” windows and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Subsequently, the user may glance away at virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
    Type: Grant
    Filed: July 27, 2020
    Date of Patent: May 30, 2023
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
  • Patent number: 11619994
    Abstract: A system includes an electronic contact lens that can detect eye gestures for initiating various actions. The electronic contact lens includes integrated sensors for obtaining sensor measurements characterizing eye motion. The sensor measurements are processed to detect gestures mapped to specific actions such as changing a power state of the electronic contact lens, activating or deactivating a user interface or other feature, or selecting an item from a virtual menu. The eye gestures may involve the user sequentially stabilizing at a starting pitch, executing a first motion that crosses a first pitch threshold, executing a second motion that crosses a second pitch threshold in an opposite direction from the starting pitch, and stabilizing at an ending pitch.
    Type: Grant
    Filed: January 14, 2022
    Date of Patent: April 4, 2023
    Assignee: Tectus Corporation
    Inventors: Abhishek Deepak Bhat, Dominic Philip Haine, Ben Rafael Kimel Green, Ramin Mirjalili
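    The gesture sequence in the abstract above (stabilize at a starting pitch, cross a first pitch threshold, cross a second threshold in the opposite direction, stabilize at an ending pitch) maps naturally onto a small state machine over a stream of pitch samples. The sketch below is an assumption-laden illustration: the threshold values, tolerance, and stability window are invented for the example.

    ```python
    class EyeGestureDetector:
        """Illustrative state machine for the pitch-based eye gesture:
        stabilize, cross a first threshold, cross a second threshold in
        the opposite direction, stabilize again. All constants assumed."""

        def __init__(self, first_threshold=15.0, second_threshold=-15.0,
                     tolerance=2.0, stable_samples=3):
            self.first = first_threshold      # degrees above starting pitch
            self.second = second_threshold    # degrees below starting pitch
            self.tol = tolerance
            self.window = stable_samples
            self.state = "stabilize_start"
            self.baseline = None
            self.end_ref = None
            self.count = 0

        def update(self, pitch):
            """Process one pitch sample (degrees); True on gesture completion."""
            if self.state == "stabilize_start":
                # Wait for the eye to hold steady at a starting pitch.
                if self.baseline is None or abs(pitch - self.baseline) > self.tol:
                    self.baseline, self.count = pitch, 1
                else:
                    self.count += 1
                if self.count >= self.window:
                    self.state = "cross_first"
            elif self.state == "cross_first":
                # First motion crosses the first pitch threshold.
                if pitch - self.baseline >= self.first:
                    self.state = "cross_second"
            elif self.state == "cross_second":
                # Second motion crosses the opposite threshold.
                if pitch - self.baseline <= self.second:
                    self.state = "stabilize_end"
                    self.end_ref, self.count = pitch, 0
            elif self.state == "stabilize_end":
                # Wait for the eye to hold steady at an ending pitch.
                if abs(pitch - self.end_ref) <= self.tol:
                    self.count += 1
                else:
                    self.end_ref, self.count = pitch, 1
                if self.count >= self.window:
                    self.state = "stabilize_start"
                    self.baseline, self.count = None, 0
                    return True
            return False
    ```

    Feeding the detector a steady hold, an upward excursion, a downward excursion, and a final hold completes the gesture; any other pattern leaves it armed but inactive.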
  • Publication number: 20230082702
    Abstract: A system includes an electronic contact lens that can detect eye gestures for initiating various actions. The electronic contact lens includes integrated sensors for obtaining sensor measurements characterizing eye motion. The sensor measurements are processed to detect gestures mapped to specific actions such as changing a power state of the electronic contact lens, activating or deactivating a user interface or other feature, or selecting an item from a virtual menu. The eye gestures may involve the user sequentially performing a first saccade quickly followed by a second saccade in an opposite direction from the first saccade.
    Type: Application
    Filed: September 10, 2021
    Publication date: March 16, 2023
    Inventors: Abhishek Deepak Bhat, Dominic Philip Haine, Ben Rafael Kimel Green, Ramin Mirjalili
  • Patent number: 11592899
    Abstract: Systems and methods for activating a button within a display are described. Embodiments of the invention provide a multi-step activation process using user eye movement within the display. The multi-step activation process comprises displaying a confirmation element in response to a first user gaze at a button. The button is subsequently activated in response to a second user gaze at the confirmation element.
    Type: Grant
    Filed: October 28, 2021
    Date of Patent: February 28, 2023
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Ben Rafael Kimel Green
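    The two-step activation flow in the abstract above (first gaze reveals a confirmation element, second gaze activates the button) amounts to a tiny state machine. The sketch below is illustrative only; the event names and the cancel-on-glance-away behavior are assumptions, not details from the patent.

    ```python
    class TwoStepButton:
        """Illustrative two-step, gaze-based button activation:
        gaze at the button shows a confirmation element, and a
        gaze at that element activates the button."""

        def __init__(self):
            self.confirm_visible = False
            self.activated = False

        def on_gaze(self, target):
            """Process a gaze event; target is 'button', 'confirm',
            or 'elsewhere'. Returns True once the button activates."""
            if target == "button" and not self.confirm_visible:
                self.confirm_visible = True   # step 1: show confirmation element
            elif target == "confirm" and self.confirm_visible:
                self.activated = True         # step 2: activate the button
            elif target == "elsewhere":
                self.confirm_visible = False  # assumed: glancing away cancels
            return self.activated
    ```

    The point of the design is that a stray gaze at the button alone never activates it; activation requires the deliberate second gaze at the confirmation element.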
  • Patent number: 11343420
    Abstract: Presented in the present disclosure are system and method embodiments that allow a user to wear a contact lens that provides a virtual framework for the user to retrieve information from one or more remote cameras and implement remote camera control via eye movement tracked by one or more motion sensors embedded within the contact lens. The remote camera control may include, but is not limited to, pan, tilt, and zoom control. A user may activate projection of content captured from a remote camera and control the remote camera via an established communication link. The communication link may be a direct link or an indirect link via one or more intermediate devices, e.g., a server and/or an accessory device. This way of activating projection and controlling a camera by tracking eye movement provides a convenient and secure means of remote camera control without the involvement of hands or voice.
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: May 24, 2022
    Assignee: Tectus Corporation
    Inventors: Scott Herz, Dominic Philip Haine
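    The abstract above describes translating tracked eye movement into remote pan/tilt/zoom commands. One minimal way to picture that mapping is a gain applied to eye-motion deltas; the gains and the command format below are assumptions for the sketch, not the patented protocol.

    ```python
    # Illustrative mapping from eye-motion deltas to a pan/tilt/zoom
    # camera command. Gains and message shape are hypothetical.
    PAN_GAIN, TILT_GAIN = 0.5, 0.5  # camera degrees per eye degree

    def eye_to_camera_command(eye_yaw_delta, eye_pitch_delta, zoom_level=1.0):
        """Translate eye-movement deltas (degrees) into a control message."""
        return {
            "pan": PAN_GAIN * eye_yaw_delta,
            "tilt": TILT_GAIN * eye_pitch_delta,
            "zoom": zoom_level,
        }
    ```

    In a full system such a message would be sent over the communication link the abstract describes, whether direct or relayed through an accessory device.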
  • Publication number: 20210208674
    Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality (“AR”) and/or virtual reality (“VR”) environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual “peek” windows and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Subsequently, the user may glance away at virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
    Type: Application
    Filed: July 27, 2020
    Publication date: July 8, 2021
    Applicant: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
  • Publication number: 20210124415
    Abstract: Presented are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an electronic contact lens, assist the wearer of the contact lens carrying a femtoprojector to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, eye motion information is used to wake a smart electronic contact lens, activate tools in a virtual scene, or any combination thereof without the need for blinking, winking, hand gestures, and use of buttons.
    Type: Application
    Filed: December 10, 2020
    Publication date: April 29, 2021
    Applicant: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz, Renaldi Winoto, Abhishek Bhat, Ramin Mirjalili, Joseph Czompo
  • Publication number: 20210026444
    Abstract: Described herein are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an eye-mounted display (EMD), assist the wearer of a contact lens carrying the EMD to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments of the invention provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, this is accomplished by revealing layers of virtual objects and content based on eye-tracking and other motion information.
    Type: Application
    Filed: July 25, 2019
    Publication date: January 28, 2021
    Applicant: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
  • Patent number: 10901505
    Abstract: Presented are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an electronic contact lens, assist the wearer of the contact lens carrying a femtoprojector to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, eye motion information is used to wake a smart electronic contact lens, activate tools in a virtual scene, or any combination thereof without the need for blinking, winking, hand gestures, and use of buttons.
    Type: Grant
    Filed: October 24, 2019
    Date of Patent: January 26, 2021
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz, Renaldi Winoto, Abhishek Bhat, Ramin Mirjalili, Joseph Czompo
  • Patent number: 10565560
    Abstract: Techniques are described for generating and presenting alternative organizational views for an organizational chart being presented inside a browser window. These different organizational views can organize the contents of the organizational chart based on different dimensions, thus allowing the user to group employees within the organizational chart across different dimensions. An advantage of presenting these different groupings is that managers can quickly check the status of their direct reports.
    Type: Grant
    Filed: November 12, 2014
    Date of Patent: February 18, 2020
    Assignee: SuccessFactors, Inc.
    Inventors: Kit Yue Zhang, David Hsia, Dominic Philip Haine
  • Patent number: 10387444
    Abstract: Some embodiments provide a non-transitory computer-readable medium that stores a program executable by at least one processing unit of a first device. The program receives from a second device a selection of a set of measures associated with data. The program also receives from the second device a selection of a set of dimensions associated with data. The program further receives from the second device a selection of a type of analysis. Based on the set of measures, the set of dimensions, and the type of analysis, the program also determines a type of visualization of the set of measures and the set of dimensions.
    Type: Grant
    Filed: December 30, 2015
    Date of Patent: August 20, 2019
    Assignee: SuccessFactors, Inc.
    Inventors: Dominic Philip Haine, Maria Clarisse Gatchalian, Anthony Ashton, Lisa Meehan, Anastasia Ellerby
  • Publication number: 20190138996
    Abstract: An automated intelligent assistant receives user input including free-form text entered into a user interface (UI). The automated intelligent assistant can parse the free-form text to identify an intent of the user. The intent can be a request for employee data, employer data, and execution of tasks relating to employment of the employee by the employer. The automated intelligent assistant can deploy an extensible markup language (XML) file corresponding to the identified intent. The XML file can define a parameter and an application program interface (API) each required for implementation of that intent. The deploying can include the automated intelligent assistant prompting the user to enter the parameter defined by the XML file into the UI, calling the API defined by the XML file and passing the entered parameter thereto, and receiving an output from the API. The automated intelligent assistant can generate a response based on the API output.
    Type: Application
    Filed: November 3, 2017
    Publication date: May 9, 2019
    Inventors: Abhijit Salvi, Anil Kumar Muddasani, David Ragones, Gregory Squire, Dominic Philip Haine
  • Publication number: 20190114591
    Abstract: In one embodiment, data for creating customized views is stored in a database, the data including data about first users and tasks having steps performed by second users. The stored data further includes data for creating a pipeline data structure comprising a plurality of stages corresponding to the steps, each stage storing requirements for completing the stage. The pipeline further stores pointers associated with the data about the first users, wherein the first users are represented in the pipeline to track the progress of the first plurality of users through the plurality of steps. Each task may have a corresponding table. Columns of the table correspond to stages of the pipeline and rows of the table correspond to different users of the second users, and each cell of the table comprises rules for generating a customized view of data.
    Type: Application
    Filed: September 14, 2018
    Publication date: April 18, 2019
    Inventors: Julie Mathers, Sharosh Rajasekher, Dominic Philip Haine
  • Patent number: 10248916
    Abstract: Techniques are described for exporting organizational charts being presented inside a browser window. The system can present an export canvas that identifies the portion of the organizational chart that is to be exported. In some embodiments, the export canvas can be automatically adjusted to prevent collisions with tiles within the organizational chart. In some examples, the export canvas can be presented on a different layer than the organizational chart, thus allowing the export canvas to move around freely without disrupting the underlying organizational chart.
    Type: Grant
    Filed: November 12, 2014
    Date of Patent: April 2, 2019
    Assignee: SuccessFactors, Inc.
    Inventors: Kit Yue Zhang, David Hsia, Dominic Philip Haine
  • Patent number: 10169734
    Abstract: Techniques are described for dynamically adjusting the layout of an organizational chart being presented inside a browser window. Adjusting the layout of the organizational chart has certain advantages, such as ensuring that the organizational chart can be displayed in the browser window with minimal scrolling. The direct reports of a manager can be presented as a matrix in the organizational chart when the manager is selected. In one example, the layout of the matrix can be a vertical vector or a two-dimensional vector depending on the number of direct reports the manager has. In another example, the layout of the matrix can change depending on the space available within the browser window to display the matrix. In other examples, the organizational chart can also be scaled based on the screen resolution or screen size of the client device that is presenting the organizational chart.
    Type: Grant
    Filed: September 19, 2014
    Date of Patent: January 1, 2019
    Assignee: SuccessFactors, Inc.
    Inventors: Kit Yue Zhang, David Hsia, Dominic Philip Haine, Scott McGhee
  • Patent number: 9953022
    Abstract: Enterprise data sources can be monitored to detect metric conditions via rules, and alerts can be generated. The alerts can be presented as natural language descriptions of metric conditions. From an alert, the reader can navigate to a story page that presents additional detail and allows further navigation within the data. Additional detail presented can include a drill down synopsis, strategies for overcoming a negative condition, links to discussions within the organization about the condition, options for sharing or collaborating about the condition, or the like.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: April 24, 2018
    Assignee: SuccessFactors, Inc.
    Inventors: Dominic Philip Haine, Michael Strezo, Michel Angelo Capraro, Lyndal Hagar, Anthony Ashton, Laesa Bolwell, Dmitri Krakovsky, Thor Axel Ahlberg
  • Publication number: 20170046056
    Abstract: Some embodiments provide a non-transitory computer-readable medium that stores a program executable by at least one processing unit of a first device. The program receives from a second device a selection of a set of measures associated with data. The program also receives from the second device a selection of a set of dimensions associated with data. The program further receives from the second device a selection of a type of analysis. Based on the set of measures, the set of dimensions, and the type of analysis, the program also determines a type of visualization of the set of measures and the set of dimensions.
    Type: Application
    Filed: December 30, 2015
    Publication date: February 16, 2017
    Inventors: Dominic Philip Haine, Maria Clarisse Gatchalian